Trust must be mutual for patient engagement to work

“A recent study in the Journal of the American Medical Informatics Association reports that nearly one in eight patients has withheld information from their healthcare providers due to security concerns. Moreover, most of the respondents were very concerned about the security of their information when it was being shared electronically or by fax. Just last week, advocacy organization Patient Privacy Rights sent a letter to the U.S. Department of Health & Human Services urging the agency to improve privacy protections of patients’ electronic health records, particularly in the cloud and in HIEs.”

Read more: Trust must be mutual for patient engagement to work – FierceEMR: http://www.fierceemr.com/story/trust-must-be-mutual-patient-engagement-work/2013-09-18

Hackers Sell Health Insurance Credentials, Bank Accounts, SSNs and Counterfeit Documents, for over $1,000 Per Dossier

The value of personal health information is very high both inside and outside the US healthcare system. At the same time, the US healthcare industry as a whole does a terrible job of protecting that data: most health data holders (hospitals and insurers) put data security dead last on the list for tech upgrades.
Beyond the lack of effective, comprehensive data security protections, at every US hospital that uses electronic records, thousands of low-level employees can snoop in millions of people’s health records.

The public expects that only their doctors and the staff on their treatment team can access their sensitive health records, but that expectation is wrong. Any hospital staff member or health IT company employee who happens to be your neighbor, relative, or stalker/abuser can easily snoop in your records.
In Austin, TX the two major city hospital chains each allow thousands of doctors and nurses access to millions of patient records.
All this will get much worse when every state requires our health data to be “exchanged” with thousands more strangers. The new state health information exchanges (HIEs) will make data theft, sale, and exposure exponentially worse.
Tell every law maker you know: all HIEs should be REQUIRED by law to ask you to agree or OPT-IN before your health data can be shared or disclosed.

Today:

  • Many states do not allow you to ‘opt out’ of HIE data sharing.
  • Most states do not allow you to prevent even very sensitive health data (like psychiatric records) from being exchanged.

There is no way to trust electronic health systems or HIEs unless our rights to control who can see and use our electronic health data are restored.

HIStalk News 3/22/13 – Quotes Dr. Deborah Peel on new CVS policy

To view the full article, please visit HIStalk News 3/22/13.

Key quote from the article:

“Patient Privacy Rights Founder Deborah Peel, MD calls a new CVS employee policy that charges employees who decline obesity checks $50 per month “incredibly coercive and invasive.” CVS covers the cost of an assessment of height, weight, body fat, blood pressure, and serum glucose and lipid levels, but also reserves the right to send the results to a health management firm even though CVS management won’t have access to the results directly. Peel says a lack of chain of custody requirements means that CVS could review the information and use it to make personnel decisions.”

Should the U.S. Adopt European-Style Data-Privacy Protections?

View the full article at Should the U.S. Adopt European-Style Data-Privacy Protections?

This urgent issue will be debated at the 3rd International Summit on the Future of Health Privacy in Washington, DC on June 5-6, 2013 at Georgetown Law Center.

The opening keynote will be delivered by Peter Hustinx, the European Data Protection Supervisor, speaking on “A health check on data privacy?”

Register to attend at www.healthprivacysummit.org. Later we will post a link to watch via live-streaming video.

How the Insurer Knows You Just Stocked Up on Ice Cream and Beer

View the full article at How the Insurer Knows You Just Stocked Up on Ice Cream and Beer.

Through your insurance carrier, your employer already has access to personal medical information such as how often you get checkups and whether you’re taking prescription medication, but now some companies are beginning to monitor where you shop and what you eat.

Some key quotes from the article:

“…But companies also have started scrutinizing employees’ other behavior more discreetly. Blue Cross and Blue Shield of North Carolina recently began buying spending data on more than 3 million people in its employer group plans. If someone, say, purchases plus-size clothing, the health plan could flag him for potential obesity—and then call or send mailings offering weight-loss solutions.”

“Some critics worry that the methods cross the line between protective and invasive—and could lead to job discrimination. ‘It’s a slippery-slope deal,’ says Dr. Deborah Peel, founder of Patient Privacy Rights, which advocates for medical-data confidentiality. She worries employers could conceivably make other conclusions about people who load up the cart with butter and sugar.”

“Analytics firms and health insurers say they obey medical-privacy regulations, and employers never see the staff’s personal health profiles but only an aggregate picture of their health needs and expected costs. And if the targeted approach feels too intrusive, employees can ask to be placed on the wellness program’s do-not-call list.”

Rekindling the patient ID debate

Unique patient identifiers carry enormous implications for patient control and privacy. Dr. Deborah Peel is quoted in this article explaining how detrimental UPIs would be for patient trust and safety. To view the full article, please visit Rekindling the patient ID debate.

Key Quotations:

“The idea of unique patient identifiers (UPIs) is not a concept extracted from the next dystopian novel. It could very well be reality in the not-so-distant future. The question remaining, however, is whether or not the benefits of such technology outweigh constitutional privacy and patient trust concerns.”

“Deborah Peel, MD, founder of Patient Privacy Rights, and a fierce opponent of UPIs, writes in a Jan. 23 Wall Street Journal article, ‘In the end, cutting out the patient will mean the erosion of patient trust. And the less we trust the system, the more patients will put health and life at risk to protect their privacy.’

Peel points to the present reality of patient health information – genetic tests, claims data and prescription records – already being sold and commercialized. ‘Universal healthcare IDs would only exacerbate such practices,’ she avers.”

Clouds in healthcare should be viewed as ominous – Quotes from Dr. Deborah Peel

A recent article in FierceEMR by Marla Durben Hirsch quotes Dr. Peel on the dangers of cloud technology in healthcare. Dr. Peel tells FierceEMR that “there’s a lot of ignorance regarding safety and privacy of these [cloud] technologies.”

Here are a few key quotes from the story:

“It’s surely no safe haven for patient information; to the contrary it is especially vulnerable to security breaches. A lot of EHR vendors that offer cloud-based EHR systems don’t take measures to keep patient data safe. Many of them don’t think they have to comply with HIPAA’s privacy and security rules, and many of their provider clients aren’t requiring their vendors to do so.” (Hirsch)

“Many providers have no idea where the vendor is hosting the providers’ patient data. It could be housed in a different state; or even outside of the country, leaving it even more vulnerable. ‘If the cloud vendor won’t tell you where the information is, walk out the door,’ Peel says.”

“Then there’s the problem of what happens to your data when your contract with the cloud vendor ends. Providers don’t pay attention to that when they sign their EHR contract, Peel warns.”

“‘The cloud can be a good place for health information if you have iron clad privacy and security protections,’ Peel says. ‘[But] people shouldn’t have to worry about their data wherever it’s held.’”

Patient privacy group (PPR) asks HHS for HIPAA cloud guidance

Government HealthIT recently wrote an article about the letter Dr. Peel of Patient Privacy Rights sent to the HHS Office for Civil Rights pushing for security guidelines, standards, and enforcement for cloud technology used in healthcare.

Here are a few key points highlighted in the article:

“Issuing guidance to strengthen and clarify cloud-based protections for data security and privacy will help assure patients (that) sensitive health data they share with their physicians and other health care professionals will be protected,” Peel said.

“Cloud-computing is proving to be valuable, Peel said, but the nation’s transition to electronic health records will be slowed ‘if patients do not have assurances that their personal medical information will always have comprehensive and meaningful security and privacy protections.’”

“Patient Privacy Rights, a group founded in 2006, is encouraging HHS to adopt guidelines that highlight ‘the lessons learned from the Phoenix Cardiac Surgery case while making it clear that HIPAA does not prevent providers from moving to the cloud as long as it is done responsibly and in compliance with the law.’”

“In general, Peel said, cloud providers and the healthcare industry at large could benefit from guidance and education on the application of federal privacy and security rules in the cloud. ‘HHS and HIPAA guidance in this area, to date, is limited,’ Peel said, recommending the National Institute of Standards and Technology’s cloud privacy guidelines as a baseline.”

Kravis Backs N.Y. Startups Using Apps to Cut Health Costs

The title should have been “Wall Street trumps the Hippocratic Oath and NY patients’ privacy” or “NY gives technology start-ups free access to millions of New Yorkers’ sensitive health data without informed consent starting in February.”

Of course we need apps to lower health costs, coordinate care, and help people get well, but apps should be developed using ‘synthetic’ data, not real patient data. Giving away valuable identifiable patient data to app developers is very risky and violates patients’ legal and ethical rights to health information privacy under state and federal law: each of us has strong rights to decide who can see and use personal health information.
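To make the distinction concrete, here is a minimal sketch, in Python, of what ‘synthetic’ development data can look like. Every field, name, and value below is fabricated for illustration; nothing is drawn from any real record.

```python
import random
import uuid

# All field names and values below are fabricated for illustration only;
# no real patient data is used or needed.
FIRST_NAMES = ["Alex", "Jordan", "Morgan", "Taylor", "Casey"]
CONDITIONS = ["hypertension", "type 2 diabetes", "asthma", "depression"]

def synthetic_patient():
    """Return one fabricated patient record for app development and testing."""
    return {
        "patient_id": str(uuid.uuid4()),        # random ID tied to no real person
        "name": random.choice(FIRST_NAMES),
        "age": random.randint(18, 90),
        "zip3": str(random.randint(100, 999)),  # coarse 3-digit ZIP prefix
        "conditions": random.sample(CONDITIONS, random.randint(1, 2)),
    }

# A developer can build, test, and demo an app against this dataset
# with zero privacy risk to any actual patient.
test_dataset = [synthetic_patient() for _ in range(1000)]
```

An app built and demoed against data like this exposes no real patient, no matter who copies it.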

What happens when app developers use, disclose, or sell Mayor Bloomberg’s, Governor Cuomo’s, Secretary of State Hillary Clinton’s, or Peter Thiel’s electronic health records? Or will access to prominent people’s health records be blocked by the data exchange, while everyone else’s future jobs and credit are put at risk by developer access to health data? Will Bloomberg publish a story about the consequences of this decision by whoever runs the NY health data exchange? Will Bloomberg write about the value, sale, and massive technology-enabled exploitation of health data for discrimination and targeted marketing of drugs and treatments, or for extortion of political or business enemies? Natasha Singer of the NYTimes calls this the ‘surveillance economy’.

The story did not mention ways to develop apps that protect patients’ sensitive information from disclosure to people not directly involved in patient care. It could have noted that the military uses “synthetic” patient data for technology research and app development, because the military realizes that NOT protecting the security and privacy of service members’ and their families’ sensitive data creates major national security risks. The military builds and tests technology and apps on synthetic data; researchers and app developers don’t get access to real, live patient data without tough security clearances and high-level review of those granted permission to access data for approved projects that benefit patients. Open access to military health databases threatens national security. Will open access to New Yorkers’ health data also threaten national security?

NY just started a national and international gold rush to develop blockbuster health apps, and it will set off a rush by other states to give away or sell identifiable patient health information in health information exchanges (HIEs) or health information organizations (HIOs) by allowing technology developers access to an incredibly large, valuable database of identifiable patient health information. Do the developers get the data free, or is NY selling health data? The bipartisan Coalition for Patient Privacy (representing 10.3M people) worked to get a ban on the sale of patient health data into the stimulus bill because the hidden sale of health data is a major industry that enables hidden discrimination in key life opportunities like jobs and credit. Selling patient data for all sorts of uses is a very lucrative industry.

Further, NY patients are being grossly misled: they think they gave consent ONLY for their health data to be exchanged so other health professionals can treat them. Have they been informed that, starting in February, dozens of app developers will be able to copy all their personal health data to build technology products they may not want or have any interest in?

Worst of all, systems that eliminate privacy lead patients to act in ways that risk their health and lives when they know their health information is not private:

  • 600,000 people per year avoid early treatment and diagnosis for cancer because they know their records will not be private
  • 2 million per year avoid early treatment and diagnosis for depression for the same reason
  • Millions per year avoid early treatment and diagnosis of STDs, for the same reason
  • One in eight patients hides data, omits information, or lies to try to keep sensitive information private

More questions:

  • What proof is there that the app developers comply with the contracts they sign?
  • Are they audited to prove the identifiable patient data is truly secure and not sold or disclosed to third parties?
  • What happens when an app developer suffers a privacy breach? Most health data today is neither secure nor encrypted. If the app developers signed Business Associate Agreements, at least they would have to report data breaches.
  • What happens when many of the app developers can’t sell their products or their businesses go bust? They will sell the patient data they used to develop the apps for cash.
  • The developers reportedly signed data use agreements “covering federal privacy rules,” which probably means they are required to comply with HIPAA. But HIPAA allows data holders to disclose and sell patient data to third parties, promoting further hidden uses of personal data that patients will never know about, much less be able to agree to. Using contracts that do not require external auditing to protect sensitive information, and not requiring proof that the developers can be trusted, is bad business practice.

NY has opened Pandora’s box without even involving the public in an informed debate.

Sizing Up De-Identification Guidance, Experts Analyze HIPAA Compliance Report (quotes PPR)

To view the full article by Marianne Kolbasuk McGee, please visit: Sizing Up De-Identification Guidance, Experts Analyze HIPAA Compliance Report.

The federal Office for Civil Rights (OCR), charged with protecting the privacy of the nation’s health data, released ‘guidance’ for “de-identifying” health data. Government agencies and corporations want to “de-identify,” release, and sell health data for many uses. There are no penalties for not following the ‘guidance’.

Releasing large databases of “de-identified” health data on thousands or millions of people could enable breakthrough research to improve health, lower costs, and improve quality of care, IF “de-identification” actually protected our privacy so that no one could tell the data is ours. But it doesn’t.

The ‘guidance’ allows easy ‘re-identification’ of health data. Publicly available databases of other personal information can be quickly compared electronically with ‘de-identified’ health databases, so names can be re-attached, creating valuable, identifiable health data sets.
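Here is a minimal Python sketch of that linkage attack. The two datasets, the field names, and the single record in each are invented for illustration; real attacks work the same way at database scale.

```python
# Hypothetical linkage attack: re-attach names to a "de-identified" health
# dataset by joining it with a public dataset (say, a voter roll) on the
# quasi-identifiers that survive de-identification. All records and field
# names here are invented for illustration.

deidentified_health = [
    {"zip": "78701", "birth_year": 1961, "sex": "F", "diagnosis": "depression"},
]
public_voter_roll = [
    {"name": "Jane Doe", "zip": "78701", "birth_year": 1961, "sex": "F"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "sex")

def reidentify(health_rows, public_rows):
    """Yield (name, diagnosis) pairs wherever the join finds a unique match."""
    index = {}
    for person in public_rows:
        key = tuple(person[f] for f in QUASI_IDENTIFIERS)
        index.setdefault(key, []).append(person["name"])
    for row in health_rows:
        matches = index.get(tuple(row[f] for f in QUASI_IDENTIFIERS), [])
        if len(matches) == 1:  # exactly one candidate: record is re-identified
            yield matches[0], row["diagnosis"]

print(list(reidentify(deidentified_health, public_voter_roll)))
# -> [('Jane Doe', 'depression')]
```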

The “de-identification” methods OCR proposed are:

  • The HIPAA “Safe Harbor” method: if 18 specific identifiers are removed (such as name, address, age, etc.), data can be released without patient consent. But 0.04% of the records can still be ‘re-identified’. (A minimal sketch of this method appears after the list.)
  • Certification by a statistical “expert” that the re-identification risk is “small” allows release of databases without patient consent.
      o There are no requirements to be an “expert”.
      o There is no definition of “small risk”.
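For concreteness, here is a minimal Python sketch of the mechanical step in the Safe Harbor method: dropping fields that fall into the identifier categories and coarsening dates and geography. The input record and field names are hypothetical. Notice what survives (ZIP prefix, birth year, sex, diagnosis): exactly the quasi-identifiers the linkage attack above exploits.

```python
# Minimal sketch of HIPAA Safe Harbor's mechanical step: drop fields in the
# identifier categories, coarsen geography to a 3-digit ZIP prefix, and
# coarsen dates to the year. The record and field names are hypothetical.

SAFE_HARBOR_FIELDS = {
    "name", "street_address", "city", "zip", "phone", "fax", "email",
    "ssn", "medical_record_number", "health_plan_number", "account_number",
    "license_number", "vehicle_id", "device_id", "url", "ip_address",
    "biometric_id", "photo",
}

def safe_harbor(record):
    """Return a copy of the record with Safe Harbor identifiers removed."""
    out = {k: v for k, v in record.items() if k not in SAFE_HARBOR_FIELDS}
    if "zip" in record:
        out["zip3"] = record["zip"][:3]                # keep 3-digit prefix
    if "birth_date" in out:
        out["birth_year"] = out.pop("birth_date")[:4]  # keep only the year
    return out

record = {"name": "Jane Doe", "zip": "78701", "birth_date": "1961-05-02",
          "sex": "F", "diagnosis": "depression"}
print(safe_harbor(record))
# -> {'sex': 'F', 'diagnosis': 'depression', 'zip3': '787', 'birth_year': '1961'}
```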

Inadequate “de-identification” of health data makes it a big target for re-identification. Health data is so valuable because it can be used for job and credit discrimination and for targeted marketing of drugs and expensive treatments. The collection and sale of intimately detailed profiles of every person in the US is a major business model online.

The OCR guidance ignores computer science, which has demonstrated that ‘de-identification’ methods can’t prevent re-identification. No single method or approach can work, because more and more ‘personally identifiable information’ is becoming publicly available, making it easier and easier to re-identify health data. See “Myths and Fallacies of ‘Personally Identifiable Information’” by Narayanan and Shmatikov, June 2010, at: http://www.cs.utexas.edu/~shmat/shmat_cacm10.pdf

Key quotes from the article:

  • “Powerful re-identification algorithms demonstrate not just a flaw in a specific anonymization technique(s), but the fundamental inadequacy of the entire privacy protection paradigm based on “de-identifying” the data.”
  • “Any information that distinguishes one person from another can be used for re-identifying data.”
  • “Privacy protection has to be built and reasoned about on a case-by-case basis.”

OCR should have recommended what Shmatikov and Narayanan proposed: case-by-case ‘adversarial testing’, comparing a “de-identified” health database against multiple publicly available databases to determine which data fields must be removed to prevent re-identification. See PPR’s paper on “adversarial testing” at: http://patientprivacyrights.org/wp-content/uploads/2010/10/ABlumberg-anonymization-memo.pdf
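A rough Python sketch of what such adversarial testing could look like, under the assumption that the tester holds both the candidate release and representative public datasets. The threshold and field names are illustrative, not drawn from PPR’s paper.

```python
from itertools import combinations

def unique_match_rate(release_rows, public_rows, fields):
    """Fraction of records in the candidate release that match exactly one
    public record on the given fields, i.e., that are linkable to a name."""
    counts = {}
    for person in public_rows:
        key = tuple(person.get(f) for f in fields)
        counts[key] = counts.get(key, 0) + 1
    hits = sum(1 for row in release_rows
               if counts.get(tuple(row.get(f) for f in fields)) == 1)
    return hits / len(release_rows)

def risky_field_sets(release_rows, public_rows, candidate_fields, threshold=0.0):
    """Flag every combination of surviving fields whose unique-match rate
    against the public data exceeds the threshold; those fields must be
    coarsened or removed before release."""
    risky = []
    for r in range(1, len(candidate_fields) + 1):
        for fields in combinations(candidate_fields, r):
            rate = unique_match_rate(release_rows, public_rows, fields)
            if rate > threshold:
                risky.append((fields, rate))
    return risky
```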

Simplest, cheapest, and best of all would be to use the stimulus billions to build electronic systems so patients can electronically consent to data use for research and other uses they approve of. Complex, expensive contracts and difficult ‘work-arounds’ (like ‘adversarial testing’) are needed to protect patient privacy only because institutions, not patients, control who can use health data. This is not what the public expects, and it prevents us from exercising our individual rights to decide who can see and use personal health information.
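As a thought experiment, the consent systems this paragraph calls for could be as simple in principle as a default-deny gate in front of every disclosure. Everything below (the ledger, the patient IDs, and the use categories) is hypothetical, a sketch of the idea rather than any existing system.

```python
# Hypothetical opt-in consent gate: no disclosure unless the patient has
# affirmatively consented to that specific use. The ledger, patient IDs,
# and use categories are invented for illustration.

CONSENT_LEDGER = {
    # (patient_id, use) -> True only if the patient explicitly opted in
    ("patient-123", "treatment"): True,
    ("patient-123", "research"): False,
}

def may_disclose(patient_id, use):
    """Default-deny: no recorded opt-in means no disclosure."""
    return CONSENT_LEDGER.get((patient_id, use), False)

def disclose(patient_id, use, record):
    """Release a record only through the consent gate."""
    if not may_disclose(patient_id, use):
        raise PermissionError(f"No opt-in consent from {patient_id} for {use!r}")
    return record
```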