Kravis Backs N.Y. Startups Using Apps to Cut Health Costs

The title should have been: “Wall Street trumps the Hippocratic Oath and NY patients’ privacy” or “NY gives technology start-ups free access to millions of New Yorkers’ sensitive health data without informed consent starting in February”.

Of course we need apps to lower health costs, coordinate care, and help people get well, but apps should be developed using ‘synthetic’ data, not real patient data. Giving away valuable identifiable patient data to app developers is very risky and violates patients’ legal and ethical rights to health information privacy under state and federal law: each of us has strong rights to decide who can see and use personal health information.

What happens when app developers use, disclose, or sell Mayor Bloomberg’s, Governor Cuomo’s, Secretary of State Hillary Clinton’s, or Peter Thiel’s electronic health records? Or will access to prominent people’s health records be blocked by the data exchange, while everyone else’s future jobs and credit are put at risk by developer access to health data? Will Bloomberg publish a story about the consequences of this decision by whoever runs the NY health data exchange? Will Bloomberg write about the value, sale, and massive technology-enabled exploitation of health data for discrimination and targeted marketing of drugs and treatments, or for extortion of political or business enemies? Natasha Singer of the NYTimes calls this the ‘surveillance economy’.

The story did not mention ways to develop apps that protect patients’ sensitive information from disclosure to people not directly involved in patient care. The story could have said that the military uses “synthetic” patient data for technology research and app development. The military realizes that NOT protecting the security and privacy of sensitive data about service members and their families creates major national security risks. It builds and tests technology and apps on synthetic data; researchers and app developers don’t get access to real, live patient data without tough security clearances and high-level review of those granted permission to access data for approved projects that benefit patients. Open access to military health databases threatens national security. Will open access to New Yorkers’ health data also threaten national security?
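The synthetic-data approach the story overlooks is easy to demonstrate. Below is a minimal sketch of generating synthetic patient records in Python, using only the standard library; the field names, value pools, and ZIP range are illustrative assumptions, not any real system’s schema:

```python
import random
import datetime

# Illustrative value pools; a serious synthetic-data generator would model
# realistic distributions and correlations, not uniform random picks.
FIRST_NAMES = ["Alex", "Jordan", "Sam", "Taylor", "Morgan"]
LAST_NAMES = ["Rivera", "Chen", "Okafor", "Smith", "Novak"]
DIAGNOSES = ["E11.9", "I10", "J45.909", "F32.9"]  # sample ICD-10 codes

def synthetic_patient(rng):
    """Build one fake patient record. No field comes from a real person."""
    return {
        "name": f"{rng.choice(FIRST_NAMES)} {rng.choice(LAST_NAMES)}",
        "date_of_birth": datetime.date(rng.randint(1930, 2010),
                                       rng.randint(1, 12), rng.randint(1, 28)),
        "zip_code": str(rng.randint(10001, 14999)),  # plausible NY-area ZIPs
        "diagnoses": rng.sample(DIAGNOSES, k=rng.randint(1, 2)),
    }

rng = random.Random(42)  # seeded, so the test data is reproducible
test_data = [synthetic_patient(rng) for _ in range(1000)]
```

A developer can build, test, and demo an app against `test_data` without touching a single real chart; real records would then be needed only for tightly controlled, approved validation, as in the military model.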

NY just started a national and international gold rush to develop blockbuster health apps AND will set off a rush by other states to give away or sell identifiable patient health information held in health information exchanges (HIEs) or health information organizations (HIOs), by allowing technology developers access to an incredibly large, valuable database of identifiable patient health information. Do the developers get the data free, or is NY selling health data? The bipartisan Coalition for Patient Privacy (representing 10.3M people) worked to get a ban on the sale of patient health data into the stimulus bill because the hidden sale of health data is a major industry that enables hidden discrimination in key life opportunities like jobs and credit. Selling patient data for all sorts of uses is a very lucrative industry.

Further, NY patients are being grossly misled: they think they gave consent ONLY for their health data to be exchanged so other health professionals can treat them. Are they informed that, starting in February, dozens of app developers will be able to copy all their personal health data to build technology products they may not want or be interested in?

Worst of all, systems that eliminate privacy cause patients to act in ways that risk their health and lives, because they know their health information is not private:

  • 600,000 people per year avoid early diagnosis and treatment for cancer because they know their records will not be private
  • 2 million per year avoid early diagnosis and treatment for depression for the same reason
  • Millions per year avoid early diagnosis and treatment of STDs, for the same reason
  • 1 in 8 hide data, omit information, or lie to try to keep sensitive information private

More questions:

  • What proof is there that the app developers comply with the contracts they sign?
  • Are they audited to prove the identifiable patient data is truly secure and not sold or disclosed to third parties?
  • What happens when an app developer suffers a privacy breach? Most health data today is not secured or encrypted. If the app developers signed Business Associate Agreements, at least they would have to report data breaches.
  • What happens when many of the app developers can’t sell their products or their businesses go bust? They will sell the patient data they used to develop the apps for cash.
  • The developers reportedly signed data use agreements “covering federal privacy rules”, which probably means they are required to comply with HIPAA. But HIPAA allows data holders to disclose and sell patient data to third parties, promoting further hidden uses of personal data that patients will never know about, much less be able to agree to. Relying on contracts that do not require external auditing to protect sensitive information, and not requiring proof that the developers can be trusted, is bad business practice.

NY has opened Pandora’s box and not even involved the public in an informed debate.

Sizing Up De-Identification Guidance, Experts Analyze HIPAA Compliance Report (quotes PPR)

To view the full article by Marianne Kolbasuk McGee, please visit: Sizing Up De-Identification Guidance, Experts Analyze HIPAA Compliance Report.

The federal Office for Civil Rights (OCR), charged with protecting the privacy of the nation’s health data, released a ‘guidance’ for “de-identifying” health data. Government agencies and corporations want to “de-identify”, release, and sell health data for many uses. There are no penalties for not following the ‘guidance’.

Releasing large databases of “de-identified” health data on thousands or millions of people could enable breakthrough research to improve health, lower costs, and improve quality of care, IF “de-identification” actually protected our privacy so that no one could tell the data is ours. But it doesn’t.

The ‘guidance’ allows easy ‘re-identification’ of health data. Publicly available databases of other personal information can be quickly compared electronically with ‘de-identified’ health databases, so names can be re-attached, creating valuable, identifiable health data sets.

The “de-identification” methods OCR proposed are:

  • The HIPAA “Safe Harbor” method: if 18 specific identifiers are removed (such as name, address, age, etc.), data can be released without patient consent. But 0.04% of the data can still be ‘re-identified’. (A minimal sketch of this stripping follows below.)
  • Certification by a statistical “expert” that the re-identification risk is “small” allows release of databases without patient consent.
    • There are no requirements to be an “expert”.
    • There is no definition of “small risk”.
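For readers unfamiliar with the mechanics, here is a minimal sketch of Safe Harbor-style stripping. The record layout is a hypothetical example; the full list of 18 identifier categories is defined in the HIPAA rule itself:

```python
import datetime

# A few of Safe Harbor's 18 identifier categories (names, addresses, phone
# numbers, SSNs, record numbers, IP addresses, photos, and so on).
DIRECT_IDENTIFIERS = {
    "name", "street_address", "phone", "email", "ssn",
    "medical_record_number", "ip_address", "photo_url",
}

def safe_harbor(record):
    """Drop direct identifiers and generalize the rest (hypothetical fields)."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "date_of_birth" in out:
        # Dates must be reduced to the year alone.
        out["birth_year"] = out.pop("date_of_birth").year
    if "zip_code" in out:
        # Only the first three ZIP digits may be kept (and even those must
        # be dropped for sparsely populated areas).
        out["zip3"] = out.pop("zip_code")[:3]
    return out

raw = {"name": "Jane Doe", "ssn": "000-00-0000",
       "date_of_birth": datetime.date(1970, 6, 1),
       "zip_code": "10012", "diagnoses": ["I10"]}
print(safe_harbor(raw))  # {'diagnoses': ['I10'], 'birth_year': 1970, 'zip3': '100'}
```

Even after this stripping, the surviving combination of birth year, ZIP3, and diagnoses can still single some people out, which is where the 0.04% figure comes from.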

Inadequate “de-identification” of health data makes it a big target for re-identification. Health data is so valuable because it can be used for job and credit discrimination and for targeted product marketing of drugs and expensive treatment. The collection and sale of intimately detailed profiles of every person in the US is a major model for online businesses.

The OCR guidance ignores computer science, which has demonstrated that ‘de-identification’ methods can’t prevent re-identification. No single method or approach can work because more and more ‘personally identifiable information’ is becoming publicly available, making it easier and easier to re-identify health data. See “Myths and Fallacies of ‘Personally Identifiable Information’” by Narayanan and Shmatikov, June 2010, at: http://www.cs.utexas.edu/~shmat/shmat_cacm10.pdf. Key quotes from the article:

  • “Powerful re-identification algorithms demonstrate not just a flaw in a specific anonymization technique(s), but the fundamental inadequacy of the entire privacy protection paradigm based on “de-identifying” the data.”
  • “Any information that distinguishes one person from another can be used for re-identifying data.”
  • “Privacy protection has to be built and reasoned about on a case-by-case basis.”

OCR should have recommended what Shmatikov and Narayanan proposed: case-by-case ‘adversarial testing’, which compares a “de-identified” health database to multiple publicly available databases to determine which data fields must be removed to prevent re-identification. See PPR’s paper on “adversarial testing” at: http://patientprivacyrights.org/wp-content/uploads/2010/10/ABlumberg-anonymization-memo.pdf
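A crude sketch of what such an adversarial test looks like in code: link the “de-identified” release to a public database (voter rolls are the classic example) on the quasi-identifiers the two share, and measure how many records the combination singles out. The field names here are hypothetical:

```python
from collections import Counter

def quasi_key(row):
    # Quasi-identifiers that survive "de-identification" and also appear
    # in public records such as voter rolls.
    return (row["birth_year"], row["sex"], row["zip3"])

def reidentification_rate(deidentified, public):
    """Fraction of 'de-identified' rows whose quasi-identifier combination
    matches exactly one person in the public database."""
    counts = Counter(quasi_key(p) for p in public)
    hits = sum(1 for r in deidentified if counts.get(quasi_key(r)) == 1)
    return hits / len(deidentified) if deidentified else 0.0
```

If the rate is not effectively zero, fields are generalized or dropped and the test is re-run against ever more public databases; as Narayanan and Shmatikov argue, there is no one-size-fits-all stopping point.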

Simplest, cheapest, and best of all would be to use the stimulus billions to build electronic systems so patients can electronically consent to data use for research and other uses they approve of. Complex, expensive contracts and difficult ‘work-arounds’ (like ‘adversarial testing’) are needed only because institutions, not patients, control who can use health data. This is not what the public expects, and it prevents us from exercising our individual rights to decide who can see and use personal health information.
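The core of such an electronic consent system is not technically exotic. A toy sketch, with an invented directive structure (real consent standards are far richer):

```python
from dataclasses import dataclass, field

@dataclass
class ConsentDirective:
    patient_id: str
    allowed_purposes: set = field(default_factory=set)   # e.g. {"treatment"}
    denied_recipients: set = field(default_factory=set)

def may_disclose(directive, purpose, recipient):
    """Permit disclosure only if the patient opted in to this purpose and
    has not blocked this recipient. The default answer is no."""
    return (purpose in directive.allowed_purposes
            and recipient not in directive.denied_recipients)

d = ConsentDirective("pt-001", allowed_purposes={"treatment", "research"})
assert may_disclose(d, "treatment", "Dr. Lee")
assert not may_disclose(d, "marketing", "DataBrokerCo")  # never consented
```

The point is the default: nothing moves unless the patient said yes, which is the opposite of today’s regulatory-permission model.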

How Medical Identity Theft Can Give You a Decade of Headaches

See the full article at How Medical Identity Theft Can Give You a Decade of Headaches.

This article tells a cautionary tale about Arnold Salinas, whose identity was stolen by someone who obtained medical care in his name. Now, any time he gets medical treatment, he has to be extremely careful that his records are actually his own, or face the possibility that he will get the WRONG treatment.

“Medical identity theft affected an estimated 1.5 million people in the U.S. at a cost of $41.3 billion last year, according to the Ponemon Institute, a research center focused on privacy and data security. The crime has grown as health care costs have swelled and job cuts have left people without employer-subsidized insurance. Making matters worse: The complexity of the medical system has made it difficult for victims to clear their name.”

It is so important that patients control and are kept abreast of their medical records, but the current system does not make this easy. According to the article, medical identity theft cases are some of the most difficult to resolve and can take years. What makes them so difficult? “You have to go provider by provider, hospital by hospital, office by office and correct each record,” said Sam Imandoust, a legal analyst with the Identity Theft Resource Center. “The frustrating part is while you’re going through and trying to clean up the records, the identity thief can continue to go around and get medical services in the victim’s name. Really there’s no way to effectively shut it down.” Another problem is even finding out your identity has been stolen. According to Pam Dixon, founder of the World Privacy Forum, “the fractured nature of the health care system makes medical identity theft hard to detect. Victims often don’t find out until two years after the crime, and cases can commonly stretch out a decade or longer.” Banks and other institutions are used to dealing with identity theft, but the medical industry isn’t equipped to handle this kind of infringement.

Onward and upward: ONC to automate Blue Button

See the full article in HealthcareITNews: Onward and upward: ONC to automate Blue Button

Why “Blue Button” matters: It is the critical first step to restore your control over personal health data.

  • If we can’t get our data (via a “Blue Button”), we can’t use or control it, much less check it for errors.
  • Few of us expect or know that today our sensitive health data flows to hidden businesses and users that have nothing to do with our health or treatment, which is why we need a map of health data flows:
    • See Prof. Sweeney explain this project in a brief video: http://tiny.cc/f466kw
    • Today’s electronic health system allows millions of people who work for doctors, hospitals, insurers, health technology companies, health data clearinghouses, etc., to use, disclose, and sell our health data without consent.
  • The current health technology system guarantees harms: use of personal health data by employers and banks, ID theft and medical ID theft, and health data sales (see the ABC World News story showing the sale of diabetic patients’ data at: http://tiny.cc/un96kw).

In 2001, the HIPAA Privacy Rule stated that patients should be able to download electronic copies of personal health data. Finally the federal government, through the Office of the National Coordinator for Health Information Technology (ONC), will actually require all electronic health records systems to let us do that.

  • FYI: the box you click to download personal health information is known as a “Blue Button”. Some places already let patients do this (the VA system and MD Anderson, for example). A toy sketch of reading such a download follows below.
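To illustrate what a patient (or an app acting for the patient) can do once the download exists, here is a toy sketch that splits a Blue Button-style plain-text export into named sections. It assumes each section heading is underlined with dashes, a simplification of the VA’s actual text layout, which varies by provider:

```python
import re

def parse_blue_button(text):
    """Split a plain-text health record export into named sections.
    Assumes headings are underlined with dashes (a simplified layout)."""
    sections, name, body, prev = {}, "HEADER", [], None
    for line in text.splitlines():
        if body and re.fullmatch(r"-{3,}", line.strip()):
            body.pop()                       # the previous line was a heading
            sections[name] = "\n".join(body).strip()
            name, body = prev.strip(), []
        else:
            body.append(line)
        prev = line
    sections[name] = "\n".join(body).strip()
    return sections

sample = "MEDICATIONS\n-----------\nLisinopril 10 mg daily\n"
print(parse_blue_button(sample)["MEDICATIONS"])   # Lisinopril 10 mg daily
```

With sections in hand, an app can check a medication list for errors, or share only the section a patient chooses rather than the whole record.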

When personal control over health data is restored, we can send our records to all the right places (for treatment and research) and NOT send records to hidden users and corporations that use it now to discriminate against us for jobs or credit, for ID theft, to impersonate us and use our health insurance to obtain treatment (medical ID theft), or for insurance, Medicare, and Medicaid fraud.

Aggressive New Texas Law Increases Fines, Training Rules; Could Hit CEs Nationwide

Aishealth.com explains the new Texas Medical Privacy Act that has recently been signed into law and quotes Dr. Deborah Peel of PPR in their latest report on patient privacy. The report is only available through subscription but below are a few key points and quotes from it. If you have a subscription to aishealth.com, you can view the full article at Aggressive New Texas Law Increases Fines, Training Rules; Could Hit CEs Nationwide.

“A new Texas law governing the privacy and security of protected health information, perhaps the broadest and among the toughest of such laws in the nation, went into effect on Sept. 1. The Texas Medical Privacy Act, signed into law June 17, 2011, by Gov. Rick Perry (R), not only increases requirements beyond those in HIPAA for organizations that are already covered entities (CEs), but greatly expands the number and type of Texas-based CEs required to comply with the privacy standards in HIPAA and adds a bunch of its own requirements. It contains separate mandates for breach notification of electronic PHI and penalties for violations.

The new law ‘is basically HIPAA, but applies to everyone who touches PHI’ and will have a ‘big impact on entities that get PHI but aren’t technically business associates – which are now effectively covered in Texas and must comply with HIPAA restrictions on use and disclosure,’ says longtime HIPAA expert and Texas attorney Jeff Drummond, a partner in the Dallas office of Jackson Walker LLP.
‘The biggest impact on CEs and BAs are the shorter timeframes for giving access to records and the training requirement,’ he says. And the new law, which amends two existing areas of Texas regulations, carries a punch: the law provides for ‘administrative, civil and criminal penalties’ that dwarf even those that were expanded under HITECH.

The law is likely to have an impact outside of Texas and spur privacy advocates to push for similar legislation in their states or at the national level. One of the most outspoken patient privacy advocates, Austin psychiatrist Deborah Peel, was among those who supported the law, testifying before elected officials during their deliberations in 2011.

‘We hope the Texas law inspires other states to write strong laws that emphatically reject hidden data flows that the data mining and data theft industry profit from at our expense,’ Peel tells RPP. ‘The states can restore and strengthen personal control over health information – it’s what the public expects from health information technology systems and it’s our right to have [such control].’ Peel adds: ‘It’s also good business to prevent thousands of people from accessing PHI, [as] fraud, identity theft and medical identity theft are exploding.’”

Patient Trust in Confidentiality Affects Health Decisions

To view the full article by Pablo Valerio, please visit Enterprise Efficiency: Patient Trust in Confidentiality Affects Health Decisions

This article highlights a survey sponsored by FairWarning that looks at how “patient privacy considerations impact the actual delivery of healthcare” in the UK and US.

Key quotes from the story:

- “CIOs and healthcare providers need to ensure the best security, not only because it is the law, but because data breaches actually affect how honest a patient might be with a doctor and how quickly they will seek medical attention.”

- “It is not enough to comply with government regulations about data protection. If a data breach occurs, patients are not going to check if the institution was following rules; they are going to blame their executives for allowing the breach to happen, regardless of the reasons.”

The survey cited in the article, “UK: How Privacy Considerations Drive Patient Decisions and Impact Patient Care Outcomes; Trust in the confidentiality of medical records influences when, where, who and what kind of medical treatment is delivered to patients”, compares attitudes about health information privacy in the UK and US.

Some key UK findings are:

- 38.3 percent stated they have postponed or would postpone seeking care for a sensitive medical condition due to privacy concerns

- More than half of patients stated that if they had a sensitive medical condition, they would withhold information from their care provider.

- Nearly 2 out of 5 stated they would postpone seeking care out of privacy concerns.

- 45.1 percent would seek care outside of their community due to privacy concerns

- 37 percent would travel… 30 miles or more to avoid being treated at a hospital they did not trust

US vs UK patients:

- UK patients are almost twice as likely to withhold information from their care provider… if they had a poor record of protecting patient privacy.

- 4 out of 10 UK patients versus nearly 3 out of 10 US patients… would put off seeking care… due to privacy concerns.

- 97 percent of UK and US patients stated chief executives and healthcare providers have a legal and ethical responsibility to protect patients’ medical records from being breached.

Attackers Demand Ransom After Encrypting Medical Center’s Server

To view the full article by John E. Dunn, please visit CIO: Attackers Demand Ransom After Encrypting Medical Center’s Server

What happens to patients when their doctors can’t get their records because thieves encrypted them? Federal law has required strong health data security protections since 2002, yet 80% of hospitals and practices don’t encrypt patient data. If The Surgeons of Lake County had been following the law and encrypting their records, the thieves could not have read the data they seized.
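For context on what encrypting stored records involves, here is a minimal sketch using the open-source Python cryptography package; the record content is a placeholder, and a real deployment also needs key management, which is the genuinely hard part:

```python
from cryptography.fernet import Fernet

# In a real deployment the key lives in a key-management system or hardware
# module, never on the same disk as the data it protects.
key = Fernet.generate_key()
box = Fernet(key)

record = b"Patient: <redacted>; Dx: <redacted>"   # stand-in for a stored record
ciphertext = box.encrypt(record)                  # this is what sits on disk

# A thief who copies the disk gets only ciphertext; only the key holder can
# recover the plaintext. Offline backups are still needed to restore service
# if an attacker encrypts or deletes the server's copy.
assert box.decrypt(ciphertext) == record
```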

Protecting Our Civil Rights in the Era of Digital Health

See the full article by William Pewen in The Atlantic: Protecting Our Civil Rights in the Era of Digital Health

Bill Pewen has written the BEST BRIEF HISTORY OF HOW HEALTH INFORMATION PRIVACY WAS ELIMINATED I HAVE EVER SEEN, covering everything from diagnoses to prescription records to DNA. Terrific to see this in The Atlantic!

He shows how technology-based discrimination works, and makes the case that selling people’s health information/profiles is a major business model for the largest technology/Internet corporations: “Millions [of people] are beginning to recognize that they are not the customers, but the product.”
“[A]dvancing technology was opening a virtual Pandora’s Box of new civil rights challenges. At the crux of these was the fact that scientific progress has been enabling increasingly sophisticated discrimination.” … “Our experience with GINA helped to reveal the tip of an emerging threat — the use of modern data systems to create new forms of discrimination — and our concern focused on the use of personal medical data. While genetic data expresses probabilities, other parts of one’s medical record reflect established fact — an individual’s diagnoses, the medications one has used, and much more.”

“Genetic discrimination comprised just one of a number of game-changing technological challenges to civil rights. Confronting these presents new obstacles, and points to the need for a paradigm shift in our approach to prevent such inappropriate bias.”

He concluded with a call for “a 2nd civil rights bill of the 21st century”, based on key principles and tests to evaluate whether technology harms people:

Principles:

  • First: “certain harmful acts must be clearly prohibited”

  • Second: “the possession and use of personal medical data should be restricted without an individual’s consent”.

Harms tests:

To determine “whether an application of technology undermines existing civil rights statutes, … consider its potential to impose harm in terms of three tests.”

  • First: “the immutability of a trait. Profiling based on an unchangeable [genetic] characteristic should raise questions, as the ability of an individual to impact these is absent.”

  • Second: “relevance… [for example] we would not permit such irrelevant traits as race or gender to be used to discriminate in the hiring of flight crews.”

  • Third: “the presumption of a zone of privacy. …neither personal medical information nor its correlates should be considered in the public domain.”

Senator Snowe and her top health expert, Bill Pewen, are real privacy heroes, responsible for key new consumer privacy and security protections in the technology portion of the stimulus bill (HITECH). The bipartisan Coalition for Patient Privacy worked very closely with them to support consumer protections they championed.

Only 26 Percent of Americans Want Electronic Medical Records, Says Xerox Survey

Xerox kindly shared all three years of their annual Electronic Health Records (EHR) online surveys by Harris Interactive. The media, industry, and government unrelentingly promote health technology as the latest, greatest stuff. But the public ain’t buying it. They want smart phones, but they don’t want EHRs.

Clearly the public is not very excited about EHRs: 74% don’t want them, because they understand the problems with EHRs so well.

To view the article, please visit Only 26 Percent of Americans Want Electronic Medical Records, Says Xerox survey

Not only do the surveys show that a low percentage of Americans want electronic health records; that percentage has remained low, at only 26% this year. Overall, 85% of the public has “concerns” about EHRs this year. The surveys also asked about specific ‘concerns’: the public worries that health data security is poor, that data can be lost or corrupted, that records can be misused, and that outages or ‘computer problems’ can take records offline and compromise care. See the results below:

To the question, “Do you want your medical records to be digital?”:

  • 26% said ‘yes’ in 2010
  • 28% said ‘yes’ in 2011
  • 26% said ‘yes’ in 2012

To the question, “Do you have concerns about digital records?”:

  • 82% said ‘yes’ in 2010
  • 83% said ‘yes’ in 2011
  • 85% said ‘yes’ in 2012

To the question, “Could your information be hacked?”:

  • 64% said ‘yes’ in 2010
  • 65% said ‘yes’ in 2011
  • 63% said ‘yes’ in 2012

To the question, “Could your digital medical records be lost or corrupted?”:

  • 55% said ‘yes’ in 2010
  • 54% said ‘yes’ in 2011
  • 50% said ‘yes’ in 2012

To the question, “Could your personal information be misused?”:

  • 57% said ‘yes’ in 2010
  • 52% said ‘yes’ in 2011
  • 51% said ‘yes’ in 2012

To the question, “Could a power outage or computer problem prevent doctors from accessing my information?”:

  • 52% said ‘yes’ in 2010
  • 52% said ‘yes’ in 2011
  • 50% said ‘yes’ in 2012

Abercrombie signs Hawaii patient privacy protection law

To view the full article in Bizjournals.com by Vanessa Van Voorhis, please visit Abercrombie signs Hawaii patient privacy protection law.

The people of Hawaii just lost their rights to health privacy. The Hawaiian legislature replaced the state’s far stronger health privacy laws with HIPAA.

Like most of the public, Hawaiian lawmakers believe HIPAA protects privacy, but it doesn’t. It hasn’t for 10 years. The key privacy protection in HIPAA was eliminated in 2002. The media has never reported this.

  • The HIPAA Privacy Rule took effect in 2001, shortly after President Bush took office. At first, it required that others ask for consent before using or disclosing our health information for treatment, payment, or healthcare operations.

  • “The consent provisions…are replaced with a new provision…that provides regulatory permission for covered entities to use and disclose protected health information for treatment, payment, and healthcare operations.”  67 Fed. Reg. 53,183

That means the millions of people who work at hospitals, doctors’ offices, labs, health plans, data clearinghouses, government agencies, pharmacies, and other places that hold health records (“covered entities”) decide when to use and disclose them, not us.

This new law is a privacy disaster for Hawaiians. They will suffer:

  • loss of the privacy of sensitive information about their minds, bodies, and genes
  • generations of discrimination
  • embarrassment and loss of reputation
  • job, credit, and insurance discrimination
  • ID theft
  • medical ID theft (where others use their health insurance to pay for treatment or for insurance fraud)