3 Reasons Your Medical Records Are at Risk

When hospitals find themselves in the middle of a breach, they usually prioritize improving their security to prevent further incidents.

In addition to defending themselves against data breaches, health systems also need to find the right balance to adequately protect their patients’ privacy.

Since medical information is now stored digitally, patients may not be fully aware of how crucial it is to protect their data from unauthorized access. Some privacy breaches are avoidable, and learning from these mistakes is essential if health systems are to maintain the security of sensitive patient information. Here are three reasons why patient security may be lacking at health organizations.

Privacy Is on the Back Burner

When health IT systems are built, patient privacy is usually not at the forefront of designers’ and engineers’ minds. These IT experts tend to put system functions ahead of privacy, which can result in poor privacy protection down the road. Some developers may leave out privacy features altogether, putting patient information at risk of being compromised.

Human Error

Psychiatric facilities in Texas recently suffered a string of data breaches, the majority of which were caused by human error, The Republic reported.

Deborah Peel, the Austin-based founder of the watchdog group Patient Privacy Rights, said repeated data breach incidents could lead patients to question whether their information is secure, cultivating distrust. “Our patients deserve privacy and expect that their information is kept confidential,” said Christine Mann, spokeswoman for the Texas Department of State Health Services.

To view the full article please visit: 3 Reasons Your Medical Records Are at Risk

Petition for OSTP to Conduct Public Comment Process on Big Data and the Future of Privacy

February 10, 2014

Patient Privacy Rights, joined by EPIC, ACLU, Center for Democracy & Technology, EFF and 24 other consumer privacy and public interest organizations asked the White House’s Office of Science and Technology Policy to issue a Request for Information in order to conduct a review that incorporates the concerns and opinions of those whose data may be collected in bulk as a result of their engagement with technology.

“We believe that the public policy considerations arising from big data and privacy are issues of national concerns that ‘require the attention at the highest levels of Government.’”

The Coalition for Patient Privacy believes that the “OSTP should consider a broad range of big data privacy issues, including but not limited to:
(1) What potential harms arise from big data collection and how are these risks currently addressed?
(2) What are the legal frameworks currently governing big data, and are they adequate?
(3) How could companies and government agencies be more transparent in the use of big data, for example, by publishing algorithms?
(4) What technical measures could promote the benefits of big data while minimizing the privacy risks?
(5) What experience have other countries had trying to address the challenges of big data?
(6) What future trends concerning big data could inform the current debate?”

For more information, see EPIC, Coalition Urge White House to Listen to Public on “Big Data and Privacy”

To view a copy of the letter, please visit Petition for OSTP to Conduct Public Comment Process on Big Data and the Future of Privacy

Guest Article: The Causes of Digital Patient Privacy Loss in EHRs and Other Health IT Systems

Check out the latest from Shahid Shah, courtesy of The Healthcare IT Guy.

This past Friday I was invited by the Patient Privacy Rights (PPR) Foundation to lead a discussion about privacy and EHRs. The discussion, entitled “Fact vs. Fiction: Best Privacy Practices for EHRs in the Cloud,” addressed patient privacy concerns and potential solutions for doctors working with EHRs.

While we are all somewhat disturbed by the slow erosion of privacy in all aspects of our digital lives, the rather rapid loss of patient privacy around health data is especially unnerving because healthcare is so near and dear to us all. To make sure we provided some actionable intelligence during the PPR discussion, I started the talk off by giving some of the reasons why we’re losing patient privacy, in the hopes that it might spur innovators to think about ways of slowing down the inevitable losses.

Here are some of the causes I mentioned on Friday, not in any particular order:

  • Most patients, even technically astute ones, don’t really understand the concept of digital privacy. Digital is a “cyber world” that is not easy to picture, so patients believe their data and privacy are protected when they may not be. I usually explain patient privacy in the digital world to non-techies using the analogy of curtains, doors, and windows. The digital health IT world of today is like a patient’s room in a hospital that is one large shared space with no curtains, no walls, no doors, etc. (even for bathrooms or showers!). In this imaginary world, every private conversation occurs where others can hear it, all procedures are performed in front of others, and so on, all without the patient’s consent, and their objections don’t even matter. If they can imagine that scenario, then patients will have a good idea of how digital privacy is conducted today: a big shared room where everyone sees and hears everything, even over patients’ objections.
  • It’s faster and easier to create non-privacy-aware IT solutions than privacy-aware ones. Having built dozens of HIPAA-compliant and highly secure enterprise health IT systems over decades, my anecdotal experience is that when it comes to features and functions vs. privacy, features win. Product designers, architects, and engineers talk the talk, but given the difficulties of creating viable systems in a coordinated, integrated digital ecosystem, it’s really hard to walk the privacy walk. Because digital privacy is so hard to describe even in simple single-enterprise systems, the difficulty of describing and defining it across multiple integrated systems is often the reason for poor privacy features in modern systems.
  • It’s less expensive to create non-privacy-aware IT solutions. Because designing privacy into the software from the beginning is hard and requires expensive security resources to do so, we often see developers wait until the end of the process to consider privacy. Privacy can no more be added on top of an existing system than security can — either it’s built into the functionality or it’s just going to be missing. Because it’s cheaper to leave it out, it’s often left out.
  • The government is incentivizing and certifying functionality over privacy and security. All the meaningful use certification and testing steps focus too much on prescribed functionality and not enough on data-centric privacy capabilities such as notifications, disclosure tracking, and compartmentalization. If privacy were important in EHRs, the NIST test plans would cover it. Privacy is difficult to define and even more difficult to implement, so the testing process doesn’t focus on it at this time.
  • Business models that favor privacy loss tend to be more profitable. Data aggregation and homogenization, resale, secondary use, and related business models tend to be quite profitable. The only way they will remain profitable is to have easy, unfettered (low-friction) ways of sharing and aggregating data. Because enhanced privacy through opt-in processes, disclosures, and notifications would end up reducing data sharing and potentially reducing revenues and profits, privacy loss is going to happen with the inevitable rise of EHRs.
  • Patients don’t really demand privacy from their providers or IT solutions in the same way they demand other things. We like to think that all patients demand digital privacy for their data. However, it’s rare for patients to choose physicians, health systems, or other care providers based on their privacy views. Even when privacy violations are found and punished, it’s uncommon for patients to switch to other providers.
  • Regulations like HIPAA have made it easy for privacy loss to occur. HIPAA has probably done more to harm privacy over the past decade than any other government regulation. More on this in a later post.

The only way to improve privacy across the digital spectrum is to realize that health providers need to conduct business in a tricky intermediary-driven health system with sometimes conflicting business goals like reduction of medical errors or lower cost (which can only come with more data sharing, not less). Digital patient privacy is important but there are many valid reasons why privacy is either hard or impossible to achieve in today’s environment. Unless we intelligently and honestly understand why we lose patient privacy we can’t really create novel and unique solutions to help curb the loss.

What do you think? What other causes of digital patient privacy loss would you add to my list above?

Courtesy of The Healthcare IT Guy.

3 reasons for the demise of patient privacy

By Dan Bowman from FierceHealthIT

Several factors have contributed to the demise of patient privacy in recent years, according to software analyst and healthcare blogger Shahid Shah (a.k.a. The Healthcare IT Guy).

For example, at a recent discussion hosted by the Patient Privacy Rights Foundation on best privacy practices for electronic health records in the cloud, Shah said patients tend not to “demand” privacy as the cost of doing business with providers.

“It’s rare for patients to choose physicians, health systems or other care providers based on their privacy views,” Shah said in a blog post summarizing thoughts he shared at the event. “Even when privacy violations are found and punished, it’s uncommon for patients to switch to other providers.”

To view the full article visit 3 reasons for the demise of patient privacy


Report: State mental hospitals dealing with privacy breaches as patient records removed

AUSTIN, Texas — There have been five incidents in the last six months where patients’ health records have made their way out of some of Texas’ 10 public psychiatric facilities, according to a review of state records by a newspaper.

In one incident, an employee at Big Spring State Hospital in West Texas was fired after officials alleged she walked out of the facility with 50 patients’ protected health records, the Austin American-Statesman reported (http://bit.ly/1i0pZ2H) Sunday.

In the other cases, which involved a total of about a dozen patients, officials determined that the breaches were caused by mistakes.

“This can’t happen,” said Christine Mann, spokeswoman for the Texas Department of State Health Services, which oversees the hospitals. “Our patients deserve privacy and expect that their information is kept confidential. We’re doing everything we can to figure out what happened and how to address it.”

Dr. Deborah Peel, the Austin founder of Patient Privacy Rights, a national watchdog group focused on the protection of medical records, said the multiple incidents at the Texas hospitals indicate a pattern of problems that raise questions about the hospital system’s ability to keep patient records safe.

“Incidents like this broadcast loud and clear that the place I go for help might not keep my information safe,” Peel said.

To view the full article, visit Report: State mental hospitals dealing with privacy breaches as patient records removed

Here’s Scary: Your Social Security Number Is Just a Click Away

From Nancy Smith of the Sunshine State News:

Snafus involving the mandated switch from paper to electronic medical records have been happening for the last few years as the Affordable Care Act geared up. Horror stories — like the one about a California orthopedic surgeon whose medical-records software provider sold his patients’ records to anybody who wanted them — are more common than most people realize. Read the incredible story.

“This is a nightmare. It’s nothing we’ve ever seen before in medicine,” said patient privacy-rights advocate Dr. Deborah Peel.

Peel said many patients and doctors don’t know the federal government quietly eliminated patients’ privacy rights for electronic records. “It’s a free-for-all,” she said. “It’s the Wild West. Today there are over 4 million different kinds of organizations and companies that can see and use our medical records without our knowledge, without our permission and we can’t refuse.”

Peel said we can actually thank Healthcare.gov, the Obamacare sign-up website, for waking us up and making us think about what happens to our personal health information on a big bureaucratic website.

All of a sudden, Americans get it, she said — and the Obama administration isn’t pleased at having to deal with another strain of negativity in the rollout of its health plan. The government, remember, spent some $2 billion just to encourage the adoption of electronic health records.

Peel, a physician and probably the most renowned national speaker on health privacy, believes Healthcare.gov will amount to government surveillance of all health information unless some mobile “app” is developed so patients can access and control the dispersal of their own data, with Social Security numbers at the top of the list.

“Health information is the most valuable personal data about you, bar none,” Peel said. “We (at Patientprivacyrights.org) tremendously support technology, but technology that’s smart, that serves you and does what you expect — that doesn’t serve hidden industries that steal data or (is subject to) government surveillance. Government technology could put us in much better control of our information.

“We need to develop a mobile ‘app’ that would let you find out what happens to your information. We need new technology and privacy protections to be put in place.” See Peel’s remarks on Patientprivacyrights.org.

Please click here to read the full article.

Company That Knows What Drugs Everyone Takes Going Public

Nearly every time you fill a prescription, your pharmacy sells details of the transaction to outside companies, which compile and analyze the information to resell to others. The data includes the age and gender of the patient; the name, address, and contact details of their doctor; and details about the prescription.

A 60-year-old company little known by the public, IMS Health, is leading the way in gathering this data. They say they have assembled “85% of the world’s prescriptions by sales revenue and approximately 400 million comprehensive, longitudinal, anonymous patient records.”

IMS Health sells data and reports to all of the top 100 global pharmaceutical and biotechnology companies, as well as consulting firms, advertising agencies, government bodies and financial firms. In a January 2nd filing with the Securities and Exchange Commission announcing an upcoming IPO, IMS said it processes data from more than 45 billion healthcare transactions annually (more than six for each human on earth, on average) and collects information from more than 780,000 different streams of data worldwide.

Deborah Peel, a Freudian psychoanalyst who founded Patient Privacy Rights in Austin, Texas, has long been concerned about corporate gathering of medical records.

“I’ve spent 35 years or more listening to how people have been harmed because their records went somewhere they didn’t expect,” she says. “It got to employers who either fired them or demoted them or used the information to destroy their reputation.”

“It’s just not right. I saw massive discrimination in the paper age. Exponential isn’t even a big enough word for how far and how much the data is going to be used in the information age,” she continued. “If personal health data ‘belongs’ to anyone, surely it belongs to the individual, not to any corporation that handles, stores, or transmits that information.”

To view the full article please visit: Company That Knows What Drugs Everyone Takes Going Public

Testimony of Deborah C. Peel, MD at the ONC’s Patient Matching Stakeholder Meeting

WASHINGTON, DC (December 16, 2013) – Patient Privacy Rights’ (PPR) founder and chair, Deborah C. Peel, MD, submitted written testimony to the U.S. Department of Health and Human Services’ Office of the National Coordinator (ONC) at today’s Patient Matching Stakeholder Meeting. The meeting discussed the initial findings from the ONC’s dedicated initiative to assess which aspects of patient identification matching are working well, where there are gaps, and where improvements are needed.


In her prepared testimony, Dr. Peel said that “the Initial Findings address the problems caused by current institutional health information technology (health IT) systems and data exchanges.” However, she also stated that the findings may not adequately address future needs, nor do they foresee how the meaningful use requirements of the Health Information Technology for Economic and Clinical Health (HITECH) Act can resolve many of the current problems with patient identity and patient matching.


Arguing that the findings present a tremendous opportunity to create and leverage genuine patient engagement, Dr. Peel said that “patients have more interest and stake in data integrity and safety than any other stakeholder.” Describing PPR’s vision of the future, Dr. Peel outlined how meaningful patient engagement will eliminate many of the complex problems caused by current patient identity systems, matching technologies, and algorithms. She also said that meaningful patient engagement means that patients can access, control, or delegate how their personal information is used and disclosed, as well as monitor all exchanges of their health data in real time.


Additionally, Dr. Peel discussed key elements for meaningful patient engagement based on Fair Information Practices (FIPs) and federal law. She said that all data holders and all health data aggregators should operate as HIPAA covered entities and should be known to patients. To provide accountability and transparency, she said that each data aggregator should provide Notices of Privacy Practices (NPPs), voluntary patient-controlled IDs, patient and physician portals, Direct Secure email between patients and physicians, Blue Button Plus (BB+), and real-time accounting of disclosures.


In her concluding remarks, Dr. Peel stated that policies and best practices should consider how future health IT systems and data exchanges will operate, and should “anticipate meaningful patient and physician engagement, lowering costs, improving data quality, integrity and patient safety.” She urged the ONC to require, promote, and incentivize the rapid adoption of technologies that meaningfully engage patients as described in her testimony.

The complete text of this testimony is here.

Can we at least try not to kill 440,000 patients per year?

Check out the latest from Doc Searls, courtesy of Doc Searls Weblog.

Obamacare matters. But the debate about it also misdirects attention away from massive collateral damage to patients. How massive? Dig “To Make Hospitals Less Deadly, a Dose of Data,” by Tina Rosenberg in The New York Times. She writes,

Until very recently, health care experts believed that preventable hospital error caused some 98,000 deaths a year in the United States — a figure based on 1984 data. But a new report from the Journal of Patient Safety using updated data holds such error responsible for many more deaths — probably around some 440,000 per year. That’s one-sixth of all deaths nationally, making preventable hospital error the third leading cause of death in the United States. And 10 to 20 times that many people suffer nonlethal but serious harm as a result of hospital mistakes.

The bold-facing is mine. In 2003, one of those statistics was my mother. I too came close in 2008, though the mistake in that case wasn’t a hospital’s, but rather a consequence of incompatibility between different silo’d systems for viewing MRIs, and an ill-informed rush into a diagnostic procedure that proved unnecessary and caused pancreatitis (which happens in 5% of those performed — I happened to be that one in twenty). That event, my doctors told me, increased my long-term risk of pancreatic cancer.

Risk is the game we’re playing here: the weighing of costs and benefits, based on available information. Thus health care is primarily the risk-weighing business we call insurance. For generations, the primary customers of health care — the ones who pay for the services — have been insurance companies. Their business is selling bets on outcomes to us, to our employers, or both. They play that game, to a large extent, by knowing more than we do. Asymmetrical knowledge R them.

Now think about the data involved. Insurance companies live in a world of data. That world is getting bigger and bigger. And yet, McKinsey tells us, it’s not big enough. In The big-data revolution in US health care: Accelerating value and innovation (subtitle: Big data could transform the health-care sector, but the industry must undergo fundamental changes before stakeholders can capture its full value), McKinsey writes,

Fiscal concerns, perhaps more than any other factor, are driving the demand for big-data applications. After more than 20 years of steady increases, health-care expenses now represent 17.6 percent of GDP—nearly $600 billion more than the expected benchmark for a nation of the United States’s size and wealth.1 To discourage overutilization, many payors have shifted from fee-for-service compensation, which rewards physicians for treatment volume, to risk-sharing arrangements that prioritize outcomes. Under the new schemes, when treatments deliver the desired results, provider compensation may be less than before. Payors are also entering similar agreements with pharmaceutical companies and basing reimbursement on a drug’s ability to improve patient health. In this new environment, health-care stakeholders have greater incentives to compile and exchange information.

While health-care costs may be paramount in big data’s rise, clinical trends also play a role. Physicians have traditionally used their judgment when making treatment decisions, but in the last few years there has been a move toward evidence-based medicine, which involves systematically reviewing clinical data and making treatment decisions based on the best available information. Aggregating individual data sets into big-data algorithms often provides the most robust evidence, since nuances in subpopulations (such as the presence of patients with gluten allergies) may be so rare that they are not readily apparent in small samples.

Although the health-care industry has lagged behind sectors like retail and banking in the use of big data—partly because of concerns about patient confidentiality—it could soon catch up. First movers in the data sphere are already achieving positive results, which is prompting other stakeholders to take action, lest they be left behind. These developments are encouraging, but they also raise an important question: is the health-care industry prepared to capture big data’s full potential, or are there roadblocks that will hamper its use?

The word “patient” appears nowhere in that long passage. The word “stakeholder” appears twice, plus eight more times in the whole piece. Still, McKinsey brooks some respect for the patient, though more as a metric zone than as a holder of a stake in outcomes:

Health-care stakeholders are well versed in capturing value and have developed many levers to assist with this goal. But traditional tools do not always take complete advantage of the insights that big data can provide. Unit-price discounts, for instance, are based primarily on contracting and negotiating leverage. And like most other well-established health-care value levers, they focus solely on reducing costs rather than improving patient outcomes. Although these tools will continue to play an important role, stakeholders will only benefit from big data if they take a more holistic, patient-centered approach to value, one that focuses equally on health-care spending and treatment outcomes.

McKinsey’s customers are not you and me. They are business executives, many of whom work in health care. As players in their game, we have zero influence. As voters in the democracy game, however, we have a bit more. That’s one reason we elected Barack Obama.

So, viewed from the level at which it plays out, the debate over health care, at least in the U.S., is between those who believe in addressing problems with business (especially the big kind) and those who believe in addressing problems with policy (especially the big kind, such as Obamacare).

Big business has been winning, mostly. This is why Obamacare turned out to be a set of policy tweaks on a business that was already highly regulated, mostly by captive lawmakers and regulators.

Meanwhile we have this irony to contemplate: while dying of bad data at a rate rivaling war and plague, our physical bodies are being doubled into digital ones. It is now possible to know one’s entire genome, including clear markers of risks such as cancer and dementia. That’s in addition to being able to know one’s quantified self (QS), plus one’s health care history.

Yet all of that data is scattered and silo’d. This is why it is hard to integrate all our available QS data, and nearly impossible to integrate all our health care history. After I left the Harvard University Health Services (HUHS) system in 2010, my doctor at the time (Richard Donohue, MD, whom I recommend highly) obtained and handed over to me the entirety of my records from HUHS. It’s not data, however. It’s a pile of paper, as thick as the Manhattan phone book. Its utility to other doctors verges on nil. Such is the nature of the bizarre information asymmetry (and burial) in the current system.

On top of that, our health care system incentivizes us to conceal our history, especially if any of that history puts us in a higher risk category, sure to pay more in health insurance premiums.

But what happens when we solve these problems, and our digital selves become fully knowable — by both our selves and our health care providers? What happens to the risk calculation business we have today, which rationalizes more than 400,000 snuffed souls per annum as collateral damage? Do we go to single-payer then, for the simple reason that the best risk calculations are based on the nation’s entire population?

I don’t know.

I do know the current system doesn’t want to go there, on either the business or the policy side. But it will. Inevitably.

At the end of whatever day this is, our physical selves will know our data selves better than any system built to hoard and manage our personal data for their interests more than for ours. When that happens the current system will break, and another one will take its place.

How many more of us will die needlessly in the meantime? And does knowing (or guessing at) that number make any difference? It hasn’t so far.

But that shouldn’t stop us. Hats off to leadership in the direction of actually solving these problems, starting with Adrian Gropper, ePatient Dave, Patient Privacy Rights, Brian Behlendorf, Esther Dyson, John Wilbanks, Tom Munnecke and countless other good people and organizations who have been pushing this rock up a hill for a long time, and aren’t about to stop. (Send Doc more names or add comments directly to this blog here.)

Courtesy of Doc Searls Weblog

Google’s $8.5M Privacy Pact Going To Inapt Orgs, Groups Say

“A coalition of privacy groups [including Patient Privacy Rights] stepped up its opposition to the proposed $8.5 million settlement of a California class action alleging Google Inc. illegally divulged search information, saying Wednesday that counsel has failed to show how the seven organizations chosen to receive cy pres funds are appropriate.”

To view the full article (only available by subscription), please visit Google’s $8.5M Privacy Pact Going To Inapt Orgs, Groups Say.