Questions of Privacy

ModernHealthcare.com recently posted a great article about PPR’s Dr. Deborah Peel and her work.

A few key points from the article:

“In 2002, HHS redrafted the privacy rule of the Health Insurance Portability and Accountability Act, replacing its patient consent requirement for the sharing of most patient records with a new provision. The rewrite afforded ‘regulatory permission,’ according to the rule, for hospitals, physicians, insurance companies, pharmacies, claims clearinghouses and other HIPAA-covered entities to use and disclose patient data for treatment, payment and a long list of other healthcare operations without patient consent.”

“’Let’s face it,’ Peel says, ‘HHS is the agency that eliminated patient control over electronic medical records and has remained hostile to patients’ rights ever since.’”

“‘Where I’m coming from is, I’ve spent all this time in a profession with people being hurt,’ Peel says. ‘Starting in the 1970s, when I first hung out my shingle, people came to me and said, if I paid you in cash, would you keep my records private. Now, we’ve got a situation where you don’t even know where all your records are. We don’t have a chain of custody for our data, or have a data map’ to track its location.”

Can computers predict medical problems? VA thinks maybe.

To view the full article written by Bob Brewin for Nextgov, please visit Can computers predict medical problems? VA thinks maybe.

“The Veterans Health Administration plans to test how advanced clinical reasoning and prediction systems can use massive amounts of archived patient data to help improve care, efficiency and health outcomes.”

Two veterans commented on the story below:

  • “total invasion of privacy, I have a big problem with a “vendor” going through my records let alone the VA. the VA doesnt exactly have a good track record of protecting information”
  • “veterans are NO LONGER guinea pigs without express PRIOR written consent, that is MEDICAL DATA covered by HIPAA, and is expressly forbidden to be managed in an open fashion and is NOT for sale.”

Like 99% of Americans, these vets oppose research use of their health information without consent:

US health IT systems and the VA could offer electronic consent to participate in studies:

  • Electronic consent tools can enable each patient to set his or her own broad rules to allow research use of their health data.
  • Vets could be ‘pinged’ for consent for EACH study, set broad rules to allow use of data for all studies, or set rules somewhere in between (such as: I will agree to all research use of my data on traumatic brain injury and PTSD, but contact me for consent for all other studies).
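The consent models described above can be sketched as a simple policy check. This is a hypothetical illustration only, not any deployed system; the policy names and topics are invented for the example.

```python
# Hypothetical sketch of per-patient consent rules for research data use.
# Policy names and topics are invented for illustration only.

ASK_EACH = "ask_each"          # ping the patient for every study
ALLOW_ALL = "allow_all"        # blanket consent for all research
ALLOW_TOPICS = "allow_topics"  # blanket consent for listed topics only

def decide(policy, allowed_topics, study_topic):
    """Return 'allow' if the study may use the data, or 'ask' if the
    patient must be contacted for this specific study."""
    if policy == ALLOW_ALL:
        return "allow"
    if policy == ALLOW_TOPICS and study_topic in allowed_topics:
        return "allow"
    return "ask"

# A veteran who pre-approves TBI and PTSD research but wants to be
# asked about everything else:
vet = (ALLOW_TOPICS, {"traumatic brain injury", "ptsd"})
```

Under this sketch, `decide(*vet, "ptsd")` returns `"allow"` while `decide(*vet, "genomics")` returns `"ask"`, matching the in-between rule described above.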

Unfortunately the new Omnibus Privacy Rule grants open access to all 300 million citizens’ sensitive health information without consent for any ‘research’ or ‘public health’ use.
The broad ‘research loophole’ in HIPAA and the new Omnibus Privacy Rule permits industry (corporations including insurers, employers, drug companies, marketers, pharmacies, labs, and others) to use and sell our personal data for “research” that we would never agree with. ‘Research’ is defined so broadly that:

  • Blue Health Intelligence (a subsidiary of Blue Cross Blue Shield) does ‘research’. It uses and sells enrollees’ health data without consent.
  • IMS Health data mines and sells the nation’s prescription records. Claiming to do ‘research’ allows IMS Health to use and sell Americans’ prescription records without consent.
  • Many electronic health record companies (Cerner, GE Centricity, Greenway, Athena Health, and Practice Fusion) are also ‘research companies’ and sell health data.
  • The ‘research’ industry sells data that is supposedly ‘de-identified’, but health data is easy to re-identify (see the paper by Narayanan and Shmatikov: http://www.cs.utexas.edu/~shmat/shmat_cacm10.pdf). And there is no way to know when ‘de-identified’ data is re-identified. Texas law bans re-identification of health data, but the system depends on whistleblowers to report violations.
  • Most ‘researchers’ are not physicians, scholars, and PhDs at academic centers, as the public assumes.

Why wouldn’t every corporation that touches health data declare itself a ‘research institution’ so it can collect, use, and sell Americans’ health data? Personal health information is THE MOST valuable data of all, but we have no way to control which corporations collect and use health data.
How large a part of the surveillance economy is personal health data?

Cloud Computing: HIPAA’s Role

The excerpts below are taken from the GOVinfoSecurity.com article Cloud Computing: HIPAA’s Role written by Marianne Kolbasuk McGee after the January 7, 2013 Panel in Washington D.C.: Health Care, the Cloud, & Privacy.

“While a privacy advocate is demanding federal guidance on how to protect health information in the cloud, one federal official says the soon-to-be-modified HIPAA privacy and security rules will apply to all business associates, including cloud vendors, helping to ensure patient data is safeguarded.

Joy Pritts, chief privacy officer in the Office of the National Coordinator for Health IT, a unit of the Department of Health and Human Services, made her comments about HIPAA during a Jan. 7 panel discussion on cloud computing hosted by Patient Privacy Rights, an advocacy group…

…Deborah Peel, M.D., founder of Patient Privacy Rights, last month sent a letter to the Department of Health and Human Services’ Office for Civil Rights urging HHS to issue guidance to healthcare providers about data security and privacy in the cloud (see: Cloud Computing: Security a Hurdle).

“The letter … asks that [HHS] look at the key problems in cloud … and what practitioners should know and understand about security and privacy of health data in the cloud,” Peel said during the panel.”

OCR Could Include Cloud Provision in Forthcoming Omnibus HIPAA Rule

The quotes below are from an article written by Alex Ruoff in the Bloomberg Health IT Law and Industry Report.

“Deborah Peel, founder of Patient Privacy Rights, said few providers understand how HIPAA rules apply to cloud computing. This is a growing concern among consumer groups, she said, as small health practices are turning to cloud computing to manage their electronic health information. Cloud computing solutions are seen as ideal for small health practices as they do not require additional staff to manage information systems, Peel said.
Cloud computing for health care requires the storage of protected health information in the cloud—a shared electronic environment—typically managed outside the health care organization accessing or generating the data (see previous article).
Little is known about the security of data managed by cloud service providers, Nicolas Terry, co-director of the Hall Center for Law and Health at Indiana University, said. Many privacy advocates are concerned that cloud storage, because it often stores information on the internet, is not properly secured, Terry said. He pointed to the April 17 agreement between Phoenix Cardiac Surgery and HHS in which the surgery practice agreed to pay $100,000 to settle allegations it violated HIPAA Security Rules (see previous article).
Phoenix was using a cloud-based application to maintain protected health information that was available on the internet and had no privacy and security controls.

Demands for Guidance

Peel’s group, in the Dec. 19 letter, called for guidance “that highlights the lessons learned from the Phoenix Cardiac Surgery case while making clear that HIPAA does not prevent providers from moving to the cloud.”

Peel’s letter asked for:
• technical safeguards for cloud computing solutions, such as risk assessments of and auditing controls for cloud-based health information technologies;
• security standards that establish the use and disclosure of individually identifiable information stored on clouds; and
• requirements for cloud solution providers and covered entities to enter into a business associate agreement outlining the terms of use for health information managed by the cloud provider.”

OCR Could Include Cloud Provision in Forthcoming Omnibus HIPAA Rule

The excerpt below is from the Bloomberg BNA article OCR Could Include Cloud Provision in Forthcoming Omnibus HIPAA Rule written by Alex Ruoff. The article is available by subscription only.

“The final omnibus rule to update Health Insurance Portability and Accountability Act regulations, expected to come out sometime early this year, could provide guidance for health care providers utilizing cloud computing technology to manage their electronic health record systems, the chief privacy officer for the Office of the National Coordinator for Health Information Technology said Jan. 7 during a panel discussion on cloud computing.

The omnibus rule is expected to address the health information security and privacy requirements for business associates of covered entities, provisions that could affect how the HIPAA Privacy Rule affects service providers that contract with health care entities, Joy Pritts, chief privacy officer for ONC, said during the panel, hosted by the consumer advocacy group, Patient Privacy Rights (PPR).

PPR Dec. 19 sent a letter to Health and Human Services’ Office for Civil Rights Director Leon Rodriguez, asking the agency to issue guidance on cloud computing security. PPR leaders say they have not received a response.”

Re: Open data is not a panacea

Regarding the story on MathBabe.org titled Open data is not a panacea

This story is a much-needed tonic to the heavy industry and government spin promoting ONLY the benefits of “open data” without mentioning the harms.

Quotes from the story:

  • When important data goes public, the edge goes to the most sophisticated data engineer, not the general public. The Goldman Sachs’s of the world will always know how to make use of “freely available to everyone” data before the average guy.
  • If there’s one thing I learned working in finance, it’s not to be naive about how information will be used. You’ve got to learn to think like an asshole to really see what to worry about.
  • So, if you’re giving me information on where public schools need help, I’m going to imagine using that information to cut off credit for people who live nearby. If you tell me where environmental complaints are being served, I’m going to draw a map and see where they aren’t being served so I can take my questionable business practices there.

Patient Privacy Rights’ goal is a major overhaul of U.S. health technology systems, so your health data is NOT OPEN DATA. Your health data should only be “open” and used with your knowledge and informed consent for purposes you agree with, like treatment and research. It will take a major overhaul for the public to trust health IT systems.

Why does Patient Privacy Rights advocate for personal control over health information and against “open data”? Answer:

For reasons that are NOT apparent, the healthcare industry shuns learning from computer scientists, mathematicians, and privacy experts about the harms and risks posed by today’s poorly designed “open” healthcare technology systems, the Internet, and the “surveillance economy”.

The health care industry and government shun these facts.

YOU can help build a data map so industry and government are forced to stop pretending that the health information of every person in the US is safe, secure, and private. Donate at: http://patientprivacyrights.org/donate/

Patient privacy group (PPR) asks HHS for HIPAA cloud guidance

Government HealthIT recently wrote an article about the letter from Dr. Peel of Patient Privacy Rights to the HHS Office for Civil Rights pushing for security guidelines, standards, and enforcement for cloud technology used in healthcare.

Here are a few key points highlighted in the article:

“Issuing guidance to strengthen and clarify cloud-based protections for data security and privacy will help assure patients (that) sensitive health data they share with their physicians and other health care professionals will be protected,” Peel said.

“Cloud-computing is proving to be valuable, Peel said, but the nation’s transition to electronic health records will be slowed ‘if patients do not have assurances that their personal medical information will always have comprehensive and meaningful security and privacy protections.’”

“Patient Privacy Rights, a group founded in 2006, is encouraging HHS to adopt guidelines that highlight ‘the lessons learned from the Phoenix Cardiac Surgery case while making it clear that HIPAA does not prevent providers from moving to the cloud as long as it is done responsibly and in compliance with the law.’”

“In general, Peel said, cloud providers and the healthcare industry at large could benefit from guidance and education on the application of federal privacy and security rules in the cloud. ‘HHS and HIPAA guidance in this area, to date, is limited,’ Peel said, recommending the National Institute of Standards and Technology’s cloud privacy guidelines as a baseline.”

Health-care sector vulnerable to hackers, researchers say

From the Washington Post article by Robert O’Harrow Jr. titled Health-care sector vulnerable to hackers, researchers say

“As the health-care industry rushed onto the Internet in search of efficiencies and improved care in recent years, it has exposed a wide array of vulnerable hospital computers and medical devices to hacking, according to documents and interviews.

Security researchers warn that intruders could exploit known gaps to steal patients’ records for use in identity theft schemes and even launch disruptive attacks that could shut down critical hospital systems.

A year-long examination of cybersecurity by The Washington Post has found that health care is among the most vulnerable industries in the country, in part because it lags behind in addressing known problems.

“I have never seen an industry with more gaping security holes,” said Avi Rubin, a computer scientist and technical director of the Information Security Institute at Johns Hopkins University. “If our financial industry regarded security the way the health-care sector does, I would stuff my cash in a mattress under my bed.”"

Kravis Backs N.Y. Startups Using Apps to Cut Health Costs

The title should have been: “Wall Street trumps the Hippocratic Oath and NY patients’ privacy” or “NY gives technology start-ups free access to millions of New Yorkers’ sensitive health data without informed consent starting in February”.

Of course we need apps to lower health costs, coordinate care, and help people get well, but apps should be developed using ‘synthetic’ data, not real patient data. Giving away valuable identifiable patient data to app developers is very risky and violates patients’ legal and ethical rights to health information privacy under state and federal law—each of us has strong rights to decide who can see and use personal health information.

What happens when app developers use, disclose, or sell Mayor Bloomberg’s, Governor Cuomo’s, Secretary of State Hillary Clinton’s, or Peter Thiel’s electronic health records? Or will access to prominent people’s health records be blocked by the data exchange, while everyone else’s future jobs and credit are put at risk by developer access to health data? Will Bloomberg publish a story about the consequences of this decision by whoever runs the NY health data exchange? Will Bloomberg write about the value, sale, and massive technology-enabled exploitation of health data for discrimination and targeted marketing of drugs and treatments, or for extortion of political or business enemies? Natasha Singer of the NYTimes calls this the ‘surveillance economy’.

The story did not mention ways to develop apps that protect patients’ sensitive information from disclosure to people not directly involved in patient care. The story could have said that the military uses “synthetic” patient data for technology research and app development. They realize that NOT protecting the security and privacy of sensitive data of members of the military and their families creates major national security risks. The military builds and tests technology and apps on synthetic data; researchers and app developers don’t get access to real, live patient data without tough security clearances and high-level review of those who are granted permission to access data for approved projects that benefit patients. Open access to military health databases threatens national security. Will open access to New Yorkers’ health data also threaten national security?

NY just started a national and international gold rush to develop blockbuster health apps AND will set off a rush by other states to give away or sell identifiable patient health information in health information exchanges (HIEs) or health information organizations (HIOs) by allowing technology developers access to an incredibly large, valuable database of identifiable patient health information. Do the developers get the data free, or is NY selling health data? The bipartisan Coalition for Patient Privacy (representing 10.3M people) worked to get a ban on the sale of patient health data into the stimulus bill because the hidden sale of health data is a major industry that enables hidden discrimination in key life opportunities like jobs and credit. Selling patient data for all sorts of uses is a very lucrative industry.

Further, NY patients are being grossly misled: they think they gave consent ONLY for their health data to be exchanged so other health professionals can treat them. Are they informed that, starting in February, dozens of app developers will be able to copy all their personal health data to build technology products they may not want?

Worst of all, systems that eliminate privacy cause patients to act in ways that risk their health and lives when they know their health information is not private:

  • 600K/year avoid early treatment and diagnosis for cancer because they know their records will not be private
  • 2M/year avoid early treatment and diagnosis for depression for the same reason
  • millions/year avoid early treatment and diagnosis of STDs for the same reason
  • 1 in 8 hide data, omit information, or lie to try to keep sensitive information private

More questions:

  • What proof is there that the app developers comply with the contracts they sign?
  • Are they audited to prove the identifiable patient data is truly secure and not sold or disclosed to third parties?
  • What happens when an app developer suffers a privacy breach? Most health data today is not secure or encrypted. If the app developers signed Business Associate Agreements, at least they would have to report data breaches.
  • What happens when many of the app developers can’t sell their products or their businesses go bust? They will sell the patient data they used to develop the apps for cash.
  • The developers reportedly signed data use agreements “covering federal privacy rules”, which probably means they are required to comply with HIPAA. But HIPAA allows data holders to disclose and sell patient data to third parties, promoting further hidden uses of personal data that patients will never know about, much less be able to agree to. Using contracts that do not require external auditing to protect sensitive information, without requiring proof that the developers can be trusted, is bad business practice.

NY has opened Pandora’s box and not even involved the public in an informed debate.

Sizing Up De-Identification Guidance, Experts Analyze HIPAA Compliance Report (quotes PPR)

To view the full article by Marianne Kolbasuk McGee, please visit: Sizing Up De-Identification Guidance, Experts Analyze HIPAA Compliance Report.

The federal Office for Civil Rights (OCR), charged with protecting the privacy of the nation’s health data, released ‘guidance’ for “de-identifying” health data. Government agencies and corporations want to “de-identify”, release, and sell health data for many uses. There are no penalties for not following the ‘guidance’.

Releasing large databases of “de-identified” health data on thousands or millions of people could enable breakthrough research to improve health, lower costs, and improve quality of care, IF “de-identification” actually protected our privacy so that no one knows it’s our personal data. But it doesn’t.

The ‘guidance’ allows easy ‘re-identification’ of health data. Publicly available databases of other personal information can be quickly compared electronically with ‘de-identified’ health databases, so names can be re-attached, creating valuable, identifiable health data sets.
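The re-identification risk can be illustrated with a toy linkage attack: joining a “de-identified” table to a public record, such as a voter roll, on shared quasi-identifiers like ZIP code, birth date, and sex. All records below are fabricated for illustration; this is a minimal sketch, not a real attack on any dataset.

```python
# Toy linkage attack: names come back by joining on quasi-identifiers.
# All records are fabricated for illustration.

deidentified = [  # "anonymous" health records: direct identifiers removed
    {"zip": "78701", "dob": "1961-03-07", "sex": "F", "dx": "depression"},
    {"zip": "10013", "dob": "1974-11-21", "sex": "M", "dx": "HIV"},
]

public_roster = [  # e.g., a voter roll: names plus the same fields
    {"name": "Jane Roe", "zip": "78701", "dob": "1961-03-07", "sex": "F"},
    {"name": "John Doe", "zip": "10013", "dob": "1974-11-21", "sex": "M"},
]

def reidentify(health, roster):
    """Attach names to 'de-identified' records via an exact join on
    (zip, dob, sex)."""
    key = lambda r: (r["zip"], r["dob"], r["sex"])
    names = {key(p): p["name"] for p in roster}
    return [(names.get(key(h)), h["dx"]) for h in health]
```

With these two tiny tables, every “de-identified” diagnosis comes back attached to a name, which is exactly the kind of cross-database comparison described above.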

The “de-identification” methods OCR proposed are:

  • The HIPAA “Safe Harbor” method: if 18 specific identifiers are removed (such as name, address, age, etc.), data can be released without patient consent. But 0.04% of the data can still be ‘re-identified’.
  • Certification by a statistical “expert” that the re-identification risk is “small” allows release of databases without patient consent.

o   There are no requirements to be an “expert”.

o   There is no definition of “small risk”.
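A Safe-Harbor-style redaction amounts to stripping a fixed list of identifier fields. The sketch below is a simplified illustration: the field names are a small invented subset of the rule’s 18 identifier categories, not the full list, though the over-89 age aggregation is part of the actual rule.

```python
# Simplified sketch of HIPAA Safe-Harbor-style redaction.
# SAFE_HARBOR_FIELDS is a small invented subset of the 18 identifier
# categories, not the rule's full list.

SAFE_HARBOR_FIELDS = {"name", "address", "phone", "email", "ssn", "mrn"}

def redact(record):
    """Drop identifier fields; aggregate ages over 89 into '90+'."""
    out = {k: v for k, v in record.items() if k not in SAFE_HARBOR_FIELDS}
    if out.get("age", 0) > 89:
        out["age"] = "90+"
    return out

patient = {"name": "Jane Roe", "ssn": "000-00-0000", "age": 93,
           "dx": "arrhythmia"}
```

The point of the criticism above is that what survives redaction (here, age band and diagnosis, and in real releases dates, ZIP codes, and more) can still single people out.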

Inadequate “de-identification” of health data makes it a big target for re-identification. Health data is so valuable because it can be used for job and credit discrimination and for targeted product marketing of drugs and expensive treatment. The collection and sale of intimately detailed profiles of every person in the US is a major model for online businesses.

The OCR guidance ignores computer science, which has demonstrated that ‘de-identification’ methods can’t prevent re-identification. No single method or approach can work because more and more ‘personally identifiable information’ is becoming publicly available, making it easier and easier to re-identify health data. See “Myths and Fallacies of ‘Personally Identifiable Information’” by Narayanan and Shmatikov, June 2010, at: http://www.cs.utexas.edu/~shmat/shmat_cacm10.pdf. Key quotes from the article:

  • “Powerful re-identification algorithms demonstrate not just a flaw in a specific anonymization technique(s), but the fundamental inadequacy of the entire privacy protection paradigm based on ‘de-identifying’ the data.”
  • “Any information that distinguishes one person from another can be used for re-identifying data.”
  • “Privacy protection has to be built and reasoned about on a case-by-case basis.”

OCR should have recommended what Shmatikov and Narayanan proposed: case-by-case ‘adversarial testing’, comparing a “de-identified” health database to multiple publicly available databases to determine which data fields must be removed to prevent re-identification. See PPR’s paper on “adversarial testing” at: http://patientprivacyrights.org/wp-content/uploads/2010/10/ABlumberg-anonymization-memo.pdf
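One simple ingredient of such testing can be sketched as a uniqueness check: count how many records in a release are unique on a candidate set of quasi-identifier fields, and keep dropping fields until no record is unique. This is a minimal illustration of the idea, not PPR’s or the paper’s actual procedure; the release data is invented.

```python
from collections import Counter

def unique_count(records, fields):
    """Number of records uniquely identified by the given fields."""
    combos = Counter(tuple(r[f] for f in fields) for r in records)
    return sum(1 for r in records
               if combos[tuple(r[f] for f in fields)] == 1)

def minimal_safe_fields(records, fields):
    """Greedily drop quasi-identifier fields until no record is unique."""
    fields = list(fields)
    while fields and unique_count(records, fields) > 0:
        # drop the field whose removal leaves the fewest unique rows
        fields.remove(min(fields, key=lambda f: unique_count(
            records, [g for g in fields if g != f])))
    return fields

# Invented release: three records sharing a ZIP but split by sex.
release = [{"zip": "78701", "sex": "F"},
           {"zip": "78701", "sex": "M"},
           {"zip": "78701", "sex": "M"}]
```

On the toy release, `zip` plus `sex` singles out one person, so the check drops `sex` and keeps only `zip`, on which no record is unique. A real adversarial test would run this kind of comparison against actual public databases, field combination by field combination.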

Simplest, cheapest, and best of all would be to use the stimulus billions to build electronic systems so patients can electronically consent to data use for research and other purposes they approve of. Complex, expensive contracts and difficult ‘work-arounds’ (like ‘adversarial testing’) are needed to protect patient privacy because institutions, not patients, control who can use health data. This is not what the public expects, and it prevents us from exercising our individual rights to decide who can see and use personal health information.