Article: Big brother to log your drinking habits and waist size as GPs are forced to hand over confidential records

To view the full article written by Jack Doyle, please visit: Big brother to log your drinking habits and waist size as GPs are forced to hand over confidential records

The UK government proposes to collect citizens’ health data in a “giant information bank”. According to the article, “a document outlining the scheme even raises the prospect of clinical data being passed on or sold to third parties”.

Quotes:

  • Doctors will be forced to hand over sensitive information about patients as part of a new programme called Everyone Counts.
  • The files will be stored in a giant information bank that privacy campaigners say represents the ‘biggest data grab in NHS history’.
  • Ross Anderson, professor of security engineering at Cambridge University, said: ‘Under these proposals, medical confidentiality is, in effect, dead and there is currently nobody standing in the way.’

David Cameron was criticized in the Guardian in 2011 when he first announced similar plans to collect all citizens’ health data in order to:

  • “encourage NHS ties with industry and fuel innovation, including £180m catalyst fund”
  • encourage “collaboration between the health service and the life sciences industry”
  • “make it easier for drug companies to run clinical trials in hospitals and to benefit from the NHS’s vast collection of patient data”.

The tens or hundreds of billions of dollars generated annually by sales of American citizens’ electronic health information are an attractive model for the UK and EU, given the dire economic situation in the EU. It’s hard to know how large the market for health data is or how health data is used without a data map. See Professor Sweeney explain the theDataMap research project at: http://tiny.cc/etyxrw

Americans can’t control who sees or uses their health data. Will UK citizens suffer the same fate?

Clouds in healthcare should be viewed as ominous - Quotes from Dr. Deborah Peel

A recent article in FierceEMR written by Marla Durben Hirsch quotes Dr. Peel about the dangers of cloud technology being used in healthcare. Dr. Peel tells FierceEMR that “There’s a lot of ignorance regarding safety and privacy of these [cloud] technologies”.

Here are a few key quotes from the story:

“It’s surely no safe haven for patient information; to the contrary it is especially vulnerable to security breaches. A lot of EHR vendors that offer cloud-based EHR systems don’t take measures to keep patient data safe. Many of them don’t think they have to comply with HIPAA’s privacy and security rules, and many of their provider clients aren’t requiring their vendors to do so.” (Hirsch)

“Many providers have no idea where the vendor is hosting the providers’ patient data. It could be housed in a different state; or even outside of the country, leaving it even more vulnerable. ‘If the cloud vendor won’t tell you where the information is, walk out the door,’ Peel says.”

“Then there’s the problem of what happens to your data when your contract with the cloud vendor ends. Providers don’t pay attention to that when they sign their EHR contract, Peel warns.”

“‘The cloud can be a good place for health information if you have iron clad privacy and security protections,’ Peel says. ‘[But] people shouldn’t have to worry about their data wherever it’s held.’”

OCR Could Include Cloud Provision in Forthcoming Omnibus HIPAA Rule

The quotes below are from an article written by Alex Ruoff in the Bloomberg Health IT Law and Industry Report.

“Deborah Peel, founder of Patient Privacy Rights, said few providers understand how HIPAA rules apply to cloud computing. This is a growing concern among consumer groups, she said, as small health practices are turning to cloud computing to manage their electronic health information. Cloud computing solutions are seen as ideal for small health practices as they do not require additional staff to manage information systems, Peel said.
Cloud computing for health care requires the storage of protected health information in the cloud—a shared electronic environment—typically managed outside the health care organization accessing or generating the data (see previous article).
Little is known about the security of data managed by cloud service providers, Nicolas Terry, co-director of the Hall Center for Law and Health at Indiana University, said. Many privacy advocates are concerned that cloud storage, because it often stores information on the internet, is not properly secured, Terry said. He pointed to the April 17 agreement between Phoenix Cardiac Surgery and HHS in which the surgery practice agreed to pay $100,000 to settle allegations it violated HIPAA Security Rules (see previous article).
Phoenix was using a cloud-based application to maintain protected health information that was available on the internet and had no privacy and security controls.

Demands for Guidance

Peel’s group, in the Dec. 19 letter, called for guidance “that highlights the lessons learned from the Phoenix Cardiac Surgery case while making clear that HIPAA does not prevent providers from moving to the cloud.”

Peel’s letter asked for:
• technical safeguards for cloud computing solutions, such as risk assessments of and auditing controls for cloud-based health information technologies;
• security standards that establish the use and disclosure of individually identifiable information stored on clouds; and
• requirements for cloud solution providers and covered entities to enter into a business associate agreement outlining the terms of use for health information managed by the cloud provider.”

Patient privacy group (PPR) asks HHS for HIPAA cloud guidance

Government HealthIT recently wrote an article about the letter from Dr. Peel of Patient Privacy Rights to the HHS Office for Civil Rights pushing for security guidelines, standards, and enforcement for cloud technology used in healthcare.

Here are a few key points highlighted in the article:

“Issuing guidance to strengthen and clarify cloud-based protections for data security and privacy will help assure patients (that) sensitive health data they share with their physicians and other health care professionals will be protected,” Peel said.

“Cloud-computing is proving to be valuable, Peel said, but the nation’s transition to electronic health records will be slowed ‘if patients do not have assurances that their personal medical information will always have comprehensive and meaningful security and privacy protections.’”

“Patient Privacy Rights, a group founded in 2006, is encouraging HHS to adopt guidelines that highlight ‘the lessons learned from the Phoenix Cardiac Surgery case while making it clear that HIPAA does not prevent providers from moving to the cloud as long as it is done responsibly and in compliance with the law.’”

“In general, Peel said, cloud providers and the healthcare industry at large could benefit from guidance and education on the application of federal privacy and security rules in the cloud. ‘HHS and HIPAA guidance in this area, to date, is limited,’ Peel said, recommending the National Institute of Standards and Technology’s cloud privacy guidelines as a baseline.”

Kravis Backs N.Y. Startups Using Apps to Cut Health Costs

The title should have been: “Wall Street trumps the Hippocratic Oath and NY patients’ privacy” or “NY gives technology start-ups free access to millions of New Yorkers’ sensitive health data without informed consent starting in February”.

Of course we need apps to lower health costs, coordinate care, and help people get well, but apps should be developed using ‘synthetic’ data, not real patient data. Giving away valuable identifiable patient data to app developers is very risky and violates patients’ legal and ethical rights to health information privacy under state and federal law; each of us has strong rights to decide who can see and use personal health information.

What happens when app developers use, disclose, or sell Mayor Bloomberg’s, Governor Cuomo’s, Secretary of State Hillary Clinton’s, or Peter Thiel’s electronic health records? Or will access to prominent people’s health records be blocked by the data exchange, while everyone else’s future jobs and credit are put at risk by developer access to health data? Will Bloomberg publish a story about the consequences of this decision by whoever runs the NY health data exchange? Will Bloomberg write about the value, sale, and massive technology-enabled exploitation of health data for discrimination and targeted marketing of drugs and treatments, or for extortion of political or business enemies? Natasha Singer of the New York Times calls this the ‘surveillance economy’.

The story did not mention ways to develop apps that protect patients’ sensitive information from disclosure to people not directly involved in patient care. The story could have said that the military uses “synthetic” patient data for technology research and app development. The military realizes that NOT protecting the security and privacy of sensitive data about service members and their families creates major national security risks. It builds and tests technology and apps on synthetic data; researchers and app developers don’t get access to real, live patient data without tough security clearances and high-level review of those who are granted permission to access data for approved projects that benefit patients. Open access to military health databases threatens national security. Will open access to New Yorkers’ health data also threaten national security?
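
To make the ‘synthetic data’ approach concrete, here is a minimal illustrative sketch (not the military’s or any exchange’s actual tooling) of how a developer might fabricate fictitious patient records and build or test an app against them, without ever touching real, identifiable health data. Every field name, value range, and function below is a hypothetical example.

```python
# Illustrative sketch only: fabricate fictitious patient records so an app can be
# built and tested without access to real, identifiable health data.
# All field names and value ranges here are hypothetical, not any real schema.
import random
import uuid
from datetime import date, timedelta

CONDITIONS = ["hypertension", "type 2 diabetes", "asthma", "depression", "none"]
SEXES = ["F", "M"]

def synthetic_patient(rng: random.Random) -> dict:
    """Return one made-up patient record with no link to any real person."""
    birth = date(1930, 1, 1) + timedelta(days=rng.randint(0, 30000))
    return {
        "patient_id": str(uuid.UUID(int=rng.getrandbits(128))),  # random ID, not a real MRN
        "sex": rng.choice(SEXES),
        "date_of_birth": birth.isoformat(),
        "condition": rng.choice(CONDITIONS),
        "systolic_bp": round(rng.gauss(125, 15), 1),  # plausible range, not a real measurement
    }

def synthetic_cohort(n: int, seed: int = 42) -> list:
    rng = random.Random(seed)  # fixed seed so the test data is reproducible
    return [synthetic_patient(rng) for _ in range(n)]

if __name__ == "__main__":
    for record in synthetic_cohort(3):
        print(record)
```

An app developed and tested against records like these never requires the developer to hold a copy of identifiable patient information; real data can stay inside the covered entity.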

NY has just started a national and international gold rush to develop blockbuster health apps AND will set off a rush by other states to give away or sell identifiable patient health information in health information exchanges (HIEs) or health information organizations (HIOs) by allowing technology developers access to an incredibly large, valuable database of identifiable patient health information. Do the developers get the data for free, or is NY selling health data? The bipartisan Coalition for Patient Privacy (representing 10.3 million people) worked to get a ban on the sale of patient health data into the stimulus bill because the hidden sale of health data is a major industry that enables hidden discrimination in key life opportunities like jobs and credit. Selling patient data for all sorts of uses is a very lucrative industry.

Further, NY patients are being grossly misled: they think they gave consent ONLY for their health data to be exchanged so other health professionals can treat them. Are they informed that, starting in February, dozens of app developers will be able to copy all their personal health data to build technology products they may not want or have any interest in?

The worst consequence of systems that eliminate privacy is that patients act in ways that risk their health and lives when they know their health information is not private:

  • 600,000 people per year avoid early diagnosis and treatment for cancer because they know their records will not be private
  • 2 million people per year avoid early diagnosis and treatment for depression for the same reason
  • millions per year avoid early diagnosis and treatment of STDs for the same reason
  • 1 in 8 hide data, omit information, or lie to try to keep sensitive information private

More questions:

  • What proof is there that the app developers comply with the contracts they sign?
  • Are they audited to prove the identifiable patient data is truly secure and not sold or disclosed to third parties?
  • What happens when an app developer suffers a privacy breach? Most health data today is not secure or encrypted. If the app developers signed Business Associate Agreements, at least they would have to report data breaches.
  • What happens when many of the app developers can’t sell their products or their businesses go bust? They will sell the patient data they used to develop the apps for cash.
  • The developers reportedly signed data use agreements “covering federal privacy rules”, which probably means they are required to comply with HIPAA. But HIPAA allows data holders to disclose and sell patient data to third parties, promoting further hidden uses of personal data that patients will never know about, much less be able to agree to. Using contracts that do not require external auditing to protect sensitive information, and not requiring proof that the developers can be trusted, is bad business practice.

NY has opened Pandora’s box and not even involved the public in an informed debate.

Benefits of Online Medical Records Outweigh the Risks - Includes Opposing Quotes from Dr. Deborah Peel

An article by Larry Magid in the Huffington Post quotes PPR on the issues surrounding electronic health records. You can view the full article here: Benefits of Online Medical Records Outweigh the Risks.

“There are also privacy concerns. In a 2010 Wall Street Journal op-ed, psychiatrist Deborah Peel, founder of Patient Privacy Rights, complained that ‘lab test results are disclosed to insurance companies before we even know the results.’ She added that data is being released to ‘insurers, drug companies, employers and others willing to pay for the information to use in making decisions about you, your job or your treatments, or for research.’ Her group is calling for tighter controls and recognition ‘that patients own their health data.’”

Two University of Miami Hospital Employees May Have Stolen & Sold Patient Data

To view the full Miami Herald article, please visit: Two University of Miami Hospital Employees May Have Stolen & Sold Patient Data

Two hospital employees are accused of stealing thousands of “face-sheets” from the University of Miami Hospital over a 22-month period. These “face-sheets” included information such as name, address, reason for visiting, insurance policy number (note: Medicare and Medicaid use SSNs as insurance policy numbers), date of birth, and the last four digits of the Social Security number. The employees have admitted to their improper conduct and were terminated immediately, but the hospital is still addressing the lasting damage of the stolen information, and there is no information about how many of these sheets may have been taken. In a statement released by the hospital, it was revealed that there is “no indication that medical records are at risk”.

Privacy and Data Management on Mobile Devices

See this link for the entire survey of 1,954 cell phone users (see excerpt below): http://pewinternet.org/~/media//Files/Reports/2012/PIP_MobilePrivacyManagement.pdf

When the public learns about hidden data use and collection on cell phones, significant numbers of people TURN the apps OFF:

  • “57% of all app users have either uninstalled an app over concerns about having to share their personal information, or declined to install an app in the first place”

What will the public do when they realize they CANNOT turn off:

  • hundreds of software ‘applications’ at hospitals that collect, use, and sell their health information
  • thousands of EHRs and other health information technologies that collect, use, and sell their health information
  • health-related websites that collect, use, and sell their health information

Patient Trust in Confidentiality Affects Health Decisions

To view the full article by Pablo Valerio, please visit Enterprise Efficiency: Patient Trust in Confidentiality Affects Health Decisions

This article highlights a survey sponsored by FairWarning that looks at how “patient privacy considerations impact the actual delivery of healthcare” in the UK and US.

Key quotes from the story:

-“CIOs and healthcare providers need to ensure the best security, not only because it is the law, but because data breaches actually affect how honest a patient might be with a doctor and how quickly they will seek medical attention.”

-“It is not enough to comply with government regulations about data protection. If a data breach occurs patients are not going to check if the institution was following rules, they are going to blame their executives for allowing the breach to happen, regardless of the reasons.”

The survey cited in the article, “UK: How Privacy Considerations Drive Patient Decisions and Impact Patient Care Outcomes; Trust in the confidentiality of medical records influences when, where, who and what kind of medical treatment is delivered to patients”, compares attitudes about health information privacy in the UK and US.

Some key UK findings are:

-38.3 percent stated they have or would postpone seeking care for a sensitive medical condition due to privacy concerns.

-More than half of patients stated that if they had a sensitive medical condition, they would withhold information from their care provider.

-Nearly 2 out of 5 stated they would postpone seeking care out of privacy concerns.

-45.1 percent would seek care outside of their community due to privacy concerns.

-37 percent would travel… 30 miles or more, to avoid being treated at a hospital they did not trust.

US vs UK patients:

-UK patients are almost twice as likely to withhold information from their care provider… if the provider had a poor record of protecting patient privacy.

-4 out of 10 UK patients versus nearly 3 out of 10 US patients … would put off seeking care … due to privacy concerns.

-97 percent of UK and US patients stated chief executives and healthcare providers have a legal and ethical responsibility to protect patients’ medical records from being breached.

Crunch Two Data Sets, Call Me in the Morning

See the full article in Bloomberg Businessweek: Crunch Two Data Sets, Call Me in the Morning

As hospitals acquire more and more digital patient data, they are quickly turning to “Big Data” tech companies with expertise in data-mining, which “has already led to some measurable improvements in patient care”, according to hospital administrators. However, patients are rarely notified when their records are used in this way because the data is exempt from federal privacy protection when it is used for “quality improvement”. “People do not like to have researchers of any stripe using their electronic health records”, says Deborah Peel, MD, of Patient Privacy Rights. “As a matter of respect and autonomy and patient-centeredness, patients want to be asked. When they are asked, by and large, they support this. It’s the not-being-asked stuff that’s really bad”. A breakdown in patient-physician trust over data privacy can cause serious problems with patient care, as patients withhold necessary information from physicians in order to avoid exposure.