Guest Article: The Causes of Digital Patient Privacy Loss in EHRs and Other Health IT Systems

Check out the latest from Shahid Shah, courtesy of The Healthcare IT Guy.

This past Friday I was invited by the Patient Privacy Rights (PPR) Foundation to lead a discussion about privacy and EHRs. The discussion, entitled “Fact vs. Fiction: Best Privacy Practices for EHRs in the Cloud,” addressed patient privacy concerns and potential solutions for doctors working with EHRs.

While we are all somewhat disturbed by the slow erosion of privacy in all aspects of our digital lives, the rather rapid loss of patient privacy around health data is especially unnerving because healthcare is so near and dear to us all. To make sure we provided some actionable intelligence during the PPR discussion, I started the talk off by giving some of the reasons we’re losing patient privacy, in the hope that it might spur innovators to think about ways of slowing down the inevitable losses.

Here are some of the causes I mentioned on Friday, not in any particular order:

  • Most patients, even technically astute ones, don’t really understand the concept of digital privacy. Digital is a “cyber world” that is not easy to picture, so patients believe their data and privacy are protected when they may not be. I usually explain patient privacy in the digital world to non-techies using the analogy of curtains, doors, and windows. The digital health IT world of today is like a hospital in which every patient shares one large space with no curtains, no walls, and no doors (even for bathrooms or showers!). In this imaginary world, every private conversation can be overheard and every procedure is performed in front of others, all without the patient’s consent, and their objections don’t even matter. If patients can imagine that scenario, they will have a good idea of how digital privacy is conducted today — a big shared room where everyone sees and hears everything, even over patients’ objections.
  • It’s faster and easier to create non-privacy-aware IT solutions than privacy-aware ones. Having built dozens of HIPAA-compliant and highly secure enterprise health IT systems over the past few decades, my anecdotal experience is that when features and functions compete with privacy, features win. Product designers, architects, and engineers talk the talk, but given the difficulty of creating viable systems in a coordinated, integrated digital ecosystem, it’s really hard to walk the privacy walk. Because digital privacy is so hard to describe even in simple single-enterprise systems, the difficulty of describing and defining it across multiple integrated systems is often the reason for poor privacy features in modern systems.
  • It’s less expensive to create non-privacy-aware IT solutions. Because designing privacy into the software from the beginning is hard and requires expensive security resources to do so, we often see developers wait until the end of the process to consider privacy. Privacy can no more be added on top of an existing system than security can — either it’s built into the functionality or it’s just going to be missing. Because it’s cheaper to leave it out, it’s often left out.
  • The government is incentivizing and certifying functionality over privacy and security. The meaningful use certification and testing steps focus too much on prescribed functionality and not enough on data-centric privacy capabilities such as notifications, disclosure tracking, and compartmentalization. If privacy were important in EHRs, the NIST test plans would cover it. Privacy is difficult to define and even more difficult to implement, so the testing process doesn’t focus on it at this time.
  • Business models that favor privacy loss tend to be more profitable. Data aggregation and homogenization, resale, secondary use, and related business models tend to be quite profitable. The only way they will remain profitable is to have easy and unfettered (low-friction) ways of sharing and aggregating data. Because enhanced privacy through opt-in processes, disclosures, and notifications would reduce data sharing and potentially reduce revenues and profits, we see that privacy loss is going to happen with the inevitable rise of EHRs.
  • Patients don’t really demand privacy from their providers or IT solutions in the same way they demand other things. We like to think that all patients demand digital privacy for their data. However, it’s rare for patients to choose physicians, health systems, or other care providers based on their privacy views. Even when privacy violations are found and punished, it’s uncommon for patients to switch to other providers.
  • Regulations like HIPAA have made it easy for privacy loss to occur. HIPAA has probably done more to harm privacy over the past decade than any other government regulation. More on this in a later post.
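Several of the causes above come down to capabilities — disclosure tracking, notifications, compartmentalization — that are rarely designed in from the start. As a rough, hypothetical sketch (all names invented for illustration) of what "built in from the beginning" means, here is a toy record store whose only read path also logs the disclosure, so untracked access simply isn't possible:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class DisclosureLog:
    """Append-only record of who received what data, and why."""
    entries: list = field(default_factory=list)

    def record(self, patient_id, recipient, purpose):
        self.entries.append({
            "patient_id": patient_id,
            "recipient": recipient,
            "purpose": purpose,
            "at": datetime.now(timezone.utc).isoformat(),
        })


class RecordStore:
    """Toy store where every read is also a tracked disclosure."""

    def __init__(self):
        self._records = {}
        self.log = DisclosureLog()

    def put(self, patient_id, record):
        self._records[patient_id] = record

    def disclose(self, patient_id, recipient, purpose):
        # The only read API logs first, so there is no untracked access path.
        self.log.record(patient_id, recipient, purpose)
        return self._records[patient_id]


store = RecordStore()
store.put("p1", {"dx": "example"})
store.disclose("p1", recipient="dr_smith", purpose="treatment")
print(len(store.log.entries))  # one disclosure tracked
```

The point of the sketch is architectural, not cryptographic: when disclosure tracking is the data access layer rather than a feature bolted on later, it cannot be skipped — which is exactly the property that is expensive to retrofit and cheap to omit.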

The only way to improve privacy across the digital spectrum is to recognize that health providers must conduct business in a tricky, intermediary-driven health system with sometimes conflicting business goals, such as reducing medical errors or lowering costs (which can only come with more data sharing, not less). Digital patient privacy is important, but there are many valid reasons why it is hard or even impossible to achieve in today’s environment. Unless we intelligently and honestly understand why we lose patient privacy, we can’t really create novel and unique solutions to help curb the loss.

What do you think? What other causes of digital patient privacy loss would you add to my list above?

Courtesy of The Healthcare IT Guy.

3 reasons for the demise of patient privacy

By Dan Bowman from FierceHealthIT

Several factors have contributed to the demise of patient privacy in recent years, according to software analyst and healthcare blogger Shahid Shah (a.k.a. The Healthcare IT Guy).

For example, at a recent discussion hosted by the Patient Privacy Rights Foundation on best privacy practices for electronic health records in the cloud, Shah said that patients tend not to “demand” privacy as the cost of doing business with providers.

“It’s rare for patients to choose physicians, health systems or other care providers based on their privacy views,” Shah said in a blog post summarizing thoughts he shared at the event. “Even when privacy violations are found and punished, it’s uncommon for patients to switch to other providers.”

To view the full article, visit 3 reasons for the demise of patient privacy.


Proposed Rules Prevent Patient Control Over Sensitive Information in Electronic Health Records (EHRs)

The proposed federal rules will require physicians and hospitals to use Electronic Health Records (EHRs) that prevent patient control over who can see and use sensitive personal health information.

This is the second time the federal government has proposed the use of technology that violates Americans’ strong rights to control the use and sale of their most sensitive personal information, from DNA to prescription records to diagnoses.

The proposed rules require EHRs to be able to show “meaningful use” (MU) and exchange of personal health data. PPR and other consumer and privacy advocacy groups submitted similar comments for the Stage 1 MU rules. These newly proposed rules are known as “Stage 2 MU” requirements for EHRs.

The most important function patients expect from electronic health systems is the power to control who can see and use their most sensitive personal information. Technologies that empower patients to decide who can see and use selected parts of their records have been working in 8 states, for over 10 years, for 4 million people with mental illness or addiction diagnoses. Today we do not have any way to know where our data flows, or who is using and selling it.

Even if we had a ‘chain of custody’ to prove who saw, used, or sold our personal health data—which we do not—it is still essential to restore patient control over personal health data so we can trust electronic health systems.

Technologies that require patient consent before data flows are cheap, effective, and should be required in all EHRs.
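To make the idea of consent gating data flow concrete, here is a minimal, hypothetical sketch (the data structures and names are invented for illustration, not drawn from any real EHR): each patient holds per-category opt-in flags, and an exchange releases only the categories with explicit consent, defaulting to deny.

```python
# Hypothetical per-category consent flags; absence of a flag means "deny".
CONSENTS = {
    "p1": {"medications": True, "mental_health": False},
}

# Toy patient record, segmented by data category.
RECORD = {
    "p1": {"medications": ["drug_a"], "mental_health": ["note_1"]},
}


def release(patient_id, requested_categories):
    """Return only the categories the patient has explicitly opted in to share."""
    consents = CONSENTS.get(patient_id, {})
    return {
        cat: RECORD[patient_id][cat]
        for cat in requested_categories
        if consents.get(cat, False)  # default deny without explicit opt-in
    }


print(release("p1", ["medications", "mental_health"]))
# only "medications" flows; "mental_health" is withheld
```

The check itself is a few lines; as the post argues, the cost lies not in the consent logic but in segmenting records by category and wiring the gate into every exchange path.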

See Patient Privacy Rights’ formal comments on the Stage 2 MU proposed requirements submitted to the Centers for Medicare and Medicaid and the Office of the National Coordinator for Health IT at: http://patientprivacyrights.org/wp-content/uploads/2012/05/PPR-Comments-for-Stage-2MU-5-7-12.pdf

They got it wrong… AGAIN!

See article: ‘Meaningful Use’ criteria released

Can you believe it? Doctors and hospitals that purchase electronic health records (EHRs) ‘wired’ for ‘back-door’ data mining will be paid to steal and use our sensitive health records without our permission!

The government and the massive health data mining industry won. Industry and the government’s plan to continue illegal and unethical data mining trumped Americans’ rights to health privacy.

The rules guarantee that employers, insurers, banks, and government will be able to use our sensitive health information—from prescriptions to DNA— to discriminate against us in jobs, credit, and insurance.

Instead, the new interim rules for EHRs should reward the purchase and use of ‘smart’ EHRs with consent technologies so patients control who can see and use their health records.

The stimulus billions will be wasted because doctors and hospitals will be rewarded for using obsolete, unethical EHR ‘clunkers’. Like the UK, the US will be forced to spend billions to correct a disastrously flawed national electronic health system that prevents patients from controlling their health records.

To understand the “meaningful use” criteria that SHOULD be required in EHRs, see the comments submitted to the Administration by the bipartisan Coalition for Patient Privacy, representing millions of Americans: http://patientprivacyrights.org/media/Coalition_to_HIT_PC_Meaningful_Use.pdf

When will the Administration and corporations get it? Privacy protections have to be tough and comprehensive if we want a national HIT system that consumers will trust and use.

To act, join www.patientprivacyrights.org to get e-alerts. Stop corporations and the government from using your sensitive health information for uses you would never agree to.