Privacy Piracy Interview with PPR Founder

PRIVACY PIRACY HOST, MARI FRANK, ESQ. INTERVIEWS
DEBORAH PEEL, MARCH 11TH, 2013

On Monday, March 11th, 2013 Deborah C. Peel, MD, founder & chair of Patient Privacy Rights, was interviewed on Privacy Piracy with Mari Frank.

Among the topics of discussion were:

  1. The current state of health privacy
  2. How individuals can help save and strengthen health privacy rights
  3. The focus of the third International Summit on the Future of Health Privacy

Re: Invasion of the Data Snatchers

Bill Keller’s NYTimes op-ed, “Invasion of the Data Snatchers,” is a fantastic piece on the hazy lines surrounding individual privacy in our new “surveillance economy.” Looking critically at The Journal News’ decision to publish the names and addresses of handgun permit holders in two nearby counties, as well as other instances in which people’s personal information is publicly shared, he asks a critical question: “What is the boundary between a public service and an invasion of privacy?” He then goes on to discuss the erosion of privacy and the challenges we face in determining “what information is worth defending and how to defend it.”

As the article says, “You can take your pick of the ways Facebook and Google are monetizing you by serving up your personal profile and browsing habits to advertisers for profit. Some of this feels harmless, or even useful — why shouldn’t my mobile device serve me ads tailored to my interests? But some of it is flat-out creepy. One of the more obnoxious trends is the custom-targeting of that irresistibly vulnerable market, our children.” Keller makes a good point—with so many different entities vying for a piece of your data, how can you know where to begin fighting back? And, it can be so overwhelming to think about the dirty underbelly of data sharing that it’s easier to say it’s no big deal in the long run, especially if you feel like you’re benefiting from it now.

For PPR, the bottom line is this: the erosion of our individual privacy is a critical issue. While some may be quick to dismiss such concerns, we have to remember that what we do now to protect our fundamental right to privacy matters. It matters to us in the present day and it matters to the futures of our children, our grandchildren, and so on…

Yes, there can be great benefits to the unparalleled connectivity and access people have to information in the rapidly shifting landscape of the digital era. At the same time, we have to make sure we establish clear boundaries and give people a say in the ways in which their information is accessed and used, particularly when it comes to sensitive data, like our personal health information. However, as Keller points out, protection of our privacy “doesn’t happen if we don’t demand it.”

This year, PPR will address a similar topic at its 3rd International Summit on the Future of Health Privacy: The Value of Health Data vs. Privacy — How Can the Conflict Be Resolved? We urge you to join us to be a part of the important conversations that will take place as we look at how our health information is valued, who has access to it, and what we can do to protect our privacy in an increasingly connected world.

OCR Could Include Cloud Provision in Forthcoming Omnibus HIPAA Rule

The quotes below are from an article written by Alex Ruoff in the Bloomberg Health IT Law and Industry Report.

“Deborah Peel, founder of Patient Privacy Rights, said few providers understand how HIPAA rules apply to cloud computing. This is a growing concern among consumer groups, she said, as small health practices are turning to cloud computing to manage their electronic health information. Cloud computing solutions are seen as ideal for small health practices as they do not require additional staff to manage information systems, Peel said.
Cloud computing for health care requires the storage of protected health information in the cloud—a shared electronic environment—typically managed outside the health care organization accessing or generating the data (see previous article).
Little is known about the security of data managed by cloud service providers, Nicolas Terry, co-director of the Hall Center for Law and Health at Indiana University, said. Many privacy advocates are concerned that cloud storage, because it often stores information on the internet, is not properly secured, Terry said. He pointed to the April 17 agreement between Phoenix Cardiac Surgery and HHS in which the surgery practice agreed to pay $100,000 to settle allegations it violated HIPAA Security Rules (see previous article).
Phoenix was using a cloud-based application to maintain protected health information that was available on the internet and had no privacy and security controls.

Demands for Guidance

Peel’s group, in the Dec. 19 letter, called for guidance “that highlights the lessons learned from the Phoenix Cardiac Surgery case while making clear that HIPAA does not prevent providers from moving to the cloud.”

Peel’s letter asked for:
• technical safeguards for cloud computing solutions, such as risk assessments of and auditing controls for cloud-based health information technologies;
• security standards that establish the use and disclosure of individually identifiable information stored on clouds; and
• requirements for cloud solution providers and covered entities to enter into a business associate agreement outlining the terms of use for health information managed by the cloud provider.”

Re: Open data is not a panacea

Regarding the story on MathBabe.org titled “Open data is not a panacea”:

This story is a much-needed tonic to the heavy industry and government spin promoting ONLY the benefits of “open data” without mentioning the harms.

Quotes from the story:

  • When important data goes public, the edge goes to the most sophisticated data engineer, not the general public. The Goldman Sachs’s of the world will always know how to make use of “freely available to everyone” data before the average guy.
  • If there’s one thing I learned working in finance, it’s not to be naive about how information will be used. You’ve got to learn to think like an asshole to really see what to worry about.
  • So, if you’re giving me information on where public schools need help, I’m going to imagine using that information to cut off credit for people who live nearby. If you tell me where environmental complaints are being served, I’m going to draw a map and see where they aren’t being served so I can take my questionable business practices there.

Patient Privacy Rights’ goal is a major overhaul of U.S. health technology systems, so your health data is NOT OPEN DATA. Your health data should only be “open” and used with your knowledge and informed consent for purposes you agree with, like treatment and research. It will take a major overhaul for the public to trust health IT systems.

Why does Patient Privacy Rights advocate for personal control over health information and oppose “open data”? Answer:

For reasons that are NOT apparent, the healthcare industry shuns learning from computer scientists, mathematicians, and privacy experts about the harms and risks posed by today’s poorly designed “open” healthcare technology systems, the Internet, and the “surveillance economy”.

The health care industry and government shun facts like:

YOU can help build a data map so industry and government are forced to stop pretending that the health information of every person in the US is safe, secure, and private. Donate at: http://patientprivacyrights.org/donate/

EHRs and Patient Privacy: An Oxymoron? Psychiatric Times Cover Story

A recent article in the Psychiatric Times based on the 2nd International Summit on the Future of Health Privacy describes the major problems with EHRs and the consequences of the misuse of this technology. The article quotes Dr. Peel, Dr. Scott Monteith, and “Julie” in describing the flaws of EHRs and HIEs. The article is available by subscription only through Psychiatric Times, but here are some highlights and quotes from the article:

“The escalating use of electronic health records (EHRs) and health information exchanges (HIEs) is fraught with unintended and sometimes dire consequences—including medical coding errors and breaches of psychiatric patients’ privacy and confidentiality, according to [Dr. Peel and Dr. Monteith] who scrutinize the field.”

“At the recent Second Annual International Summit on the Future of Health Privacy, psychiatrist Scott Monteith, MD, Clinical Assistant Professor in the Departments of Psychiatry and Family Medicine at Michigan State University and a medical informaticist, relayed the experience of a patient who discovered that her EHR erroneously reported a history of inhalant abuse. In reality, she had a history of “caffeine intoxication.” After much investigation, the problem was identified. The DSM-IV-TR code (305.90) is used for 4 different diagnoses, including caffeine intoxication and inhalant abuse, but the EHR’s printout only made the inhalant abuse diagnosis visible. Although the error was reported to the EHR vendor, the problem persists after almost 2 years.”
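For readers wondering how such an error can happen, here is a minimal, hypothetical sketch (in Python) of the mechanism the article describes: a single DSM-IV-TR code maps to several distinct diagnoses, and a printout routine that resolves the code to just one label will display the wrong diagnosis. The code table, function name, and display logic are illustrative assumptions, not the vendor’s actual software.

    # Hypothetical illustration only, not any vendor's actual software:
    # one DSM-IV-TR code is shared by several distinct diagnoses.
    CODE_TO_DIAGNOSES = {
        "305.90": [
            "Inhalant abuse",
            "Caffeine intoxication",
            # per the article, two other diagnoses also share this code
        ],
    }

    def print_problem_list(chart_codes):
        """A naive printout that resolves each stored code to only one label."""
        for code in chart_codes:
            label = CODE_TO_DIAGNOSES[code][0]  # bug: ignores the other diagnoses sharing the code
            print(f"{code}: {label}")

    # The patient was coded 305.90 for caffeine intoxication, but this
    # printout surfaces "Inhalant abuse" instead.
    print_problem_list(["305.90"])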

“‘It is impossible for consumers to weigh the risks and benefits of using health IT and data exchanges when they have no idea where their data flows, who is using it or the purpose of its use,’ wrote Peel, a psychiatrist and psychoanalyst.”

“…Peel emphasized the importance of patients being able to control access to sensitive personal health information. The open source consent technologies, she explained, have been used for more than 12 years by many state mental health departments to exchange sensitive mental health and substance abuse data on some 4 million people in more than 8 states.”

“…‘Millions of patients/year refuse to seek treatment when they know they cannot control where their data flows,’ she wrote. ‘Any HIE or EHR that cannot selectively share data with the patient’s meaningful consent, withhold data without consent, AND withhold erroneous data is a failed system or technology. The refusal of certain health IT companies to build technologies that comply with the law and what patients expect shows very poor judgment.’”

If you wish to view the full article by Arline Kaplan and are a subscriber to Psychiatric Times, it can be found at Electronic Health Records and Patient Privacy: An Oxymoron?

ACC privacy breach victim ‘felt suicidal’

See the full article at Radio New Zealand: ACC privacy breach victim ‘felt suicidal’

This story is about the effects of a data breach on a New Zealand woman with very sensitive information in her electronic health records.

Like “Julie,” who told the story of how her mental health records were exposed throughout the Partners HealthCare system, the New Zealand woman is also a victim of sexual abuse. The New Zealand corporation holding her data sent it to someone else along with information on thousands of other people.

Similar to the experiences reported by US victims of health data breaches, the response to her data breach was underwhelming and irrelevant to the resulting damages: i.e., the emotional harm, the loss of trust in the data holder, and the lack of compensation for future ID theft or medical ID theft. No assurances or remediation were offered against future use or sale of her information, even though it often takes years to discover ID theft and medical ID theft. She was offered $250 as compensation, and the data-holding corporation stated the amount was “based on the extent of the breach and the level of harm or potential harm associated with it, as well as the client’s individual circumstances.” Clearly an inadequate, insensitive response.

Apparently, inadequate, ineffective, and insensitive responses to data breaches occur across the globe.

In the US, there is no “chain of custody” for any sensitive personal information and no way to control who gets it. There is no way to track or prevent the flow of health information to hidden data users and thieves. BUT, you can help by adding to the map of hidden flows at theDataMap.org. US patients can’t weigh the risks vs. benefits of using electronic health systems without knowing who has copies of personal health records, from prescription records to DNA to diagnoses. We don’t know whether that information is sold as intimate health profiles, used for ‘research’ or ‘data analytics’, or exploited for fraud, extortion, or ID and medical ID theft.

In the US, few Congressional leaders fight to restore patient control over health data and to ensure data security. Most in Congress vote for the hidden data mining industry against the public interest and against patients’ rights to health information privacy. Two leaders, the co-chairs of the House Privacy Caucus, Representatives Barton and Markey, received “Louis D. Brandeis Privacy Awards” at the 2nd International Summit on the Future of Health Privacy in Washington, DC on June 6th. See: www.healthprivacysummit.org or http://tiny.cc/nrhkgw for the agenda. The video of the Celebration of Privacy will soon be posted there.

Electronic health information is THE most valuable personal information on Earth—and US corporations and government see and use it without our knowledge or consent to make decisions about us. Tell Congress to put you in control over who can see your sensitive electronic health information, to protect your job, your reputation, and your children’s futures.

2-part story on “Julie” who spoke at the 2nd International Summit on the Future of Health Privacy

See the stories written by Joe Conn at ModernHealthcare.com: ‘Julie’ learns that privacy is more illusion than reality & How ‘Julie’ got a big surprise about medical records privacy

These stories matter for many reasons, not the least of which is that Partners is switching to Epic EHRs and Epic’s CEO has openly opposed data segmentation for years. She claims it’s impossible, too expensive, can’t be done, etc. Partners is about to spend hundreds of millions of dollars on a failed electronic health records system.

The claim that data segmentation cannot be done is incorrect. One example is the open source consent technologies used for over 12 years by many state mental health departments to exchange sensitive mental health and substance abuse data on over 4 million people in over 8 states (the states belong to the NDIIC). Further, the state of MA has very strong laws that require consent for the disclosure of mental health information, as do all 50 states.

Why would Partners choose a product that fails to protect patient privacy in such a major way? This will undermine trust in doctors, hospitals, and, worst of all, in ALL electronic systems. Millions of patients per year refuse to seek treatment when they know they cannot control where their data flows. Any HIE or EHR that cannot selectively share data with the patient’s meaningful consent, withhold data without consent, AND withhold erroneous data is a failed system or technology. The refusal of certain health IT companies to build technologies that comply with the law and with what patients expect shows very poor judgment.
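To make concrete what “selectively share data with the patient’s meaningful consent” could look like, here is a minimal, hypothetical sketch of consent-based data segmentation in Python. The record categories, consent structure, and function names are illustrative assumptions for this post, not a description of the open source consent technologies mentioned above or of any vendor’s product.

    # Hypothetical sketch of consent-based data segmentation for an HIE/EHR.
    from dataclasses import dataclass, field

    @dataclass
    class RecordEntry:
        category: str           # e.g. "mental_health", "medications", "labs"
        text: str
        disputed: bool = False  # flagged by the patient as erroneous

    @dataclass
    class ConsentDirective:
        # Categories the patient has agreed to share with a given recipient.
        allowed_categories: set = field(default_factory=set)

    def segment_for_release(entries, consent):
        """Release only consented categories, and always withhold disputed entries."""
        return [e for e in entries
                if e.category in consent.allowed_categories and not e.disputed]

    chart = [
        RecordEntry("medications", "lisinopril 10 mg daily"),
        RecordEntry("mental_health", "psychotherapy note"),
        RecordEntry("labs", "result filed to the wrong patient", disputed=True),
    ]
    consent = ConsentDirective(allowed_categories={"medications", "labs"})
    for entry in segment_for_release(chart, consent):
        print(entry.category, "-", entry.text)  # prints only the consented, undisputed entry

The point of the sketch is that withholding a category, or an erroneous entry, is an ordinary filtering step rather than an exotic or impossibly expensive feature.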

Who Should Have Access to Mental Health Records?

See the full story in The Globe: Who Should Have Access to Mental Health Records?

“Under federal health privacy laws, patients must sign a standard permission form for providers to share their medical information for purposes of treatment and billing. Policies on sharing psychiatric notes vary.

At Beth Israel Deaconess Medical Center, for example, psychiatrists decide whether to put notes in a locked area of the record, which other doctors can see only if they provide written justification.

At Partners, patients can ask that notes be restricted, but the organization evaluates the requests on a case-by-case basis. In the case of Julie — who does not want her full name published because she’s worried about being stigmatized — Partners eventually agreed to restrict access to the therapy notes written between 2002 and 2009. But the provider network would not automatically sequester future notes.

Julie told her story during the International Summit on the Future of Health Privacy, held in Washington, D.C. earlier this month and sponsored by advocacy group Patient Privacy Rights and Georgetown University Law Center’s O’Neill Institute for National and Global Health Law.

There is a push in health care policy toward more integration of mental and medical health services to better serve patient needs in all settings. Dr. Thomas Lee, head of the Partners’ physician organization, points to it in this story.

“Schizophrenia and Parkinson’s disease are both biochemical disorders of the brain,” he told Kowalczyk. “Why is one considered mental health and the other medical?”

The catch is that privacy — trust, really — is paramount in serving people with sensitive mental health concerns. So, what’s the solution? How should records be handled to protect patients and provide the best possible care?”

Electronic Health Records: Balancing Progress and Privacy

See the full story on the Bioethics Forum Blog: Electronic Health Records: Balancing Progress and Privacy

“Regardless of the fate of the Affordable Care Act, it has set in motion a drive toward greater use of information technology, particularly with regard to electronic health records (EHRs). These technologies promise to increase the transmission, sharing, and use of health data across the health care system, thereby improving quality and reducing unnecessary costs. But they do not come without raising serious ethical questions, particularly those related to privacy. This was the topic of the 2nd International Summit on the Future of Health Privacy hosted by Patient Privacy Rights at Georgetown Law School on June 6 and 7. The two-day event brought together national and international experts on health privacy, technology, and law; patient advocates; industry experts; and top governmental officials to discuss whether there is an American health privacy crisis.”

Read more at The Hastings Center Bioethics Forum

Get information and updates about the International Summit on the Future of Health Privacy at www.HealthPrivacySummit.org

Re: “You for Sale: A Data Giant Is Mapping, and Sharing, the Consumer Genome”

The comment below was submitted in response to the New York Times article “You for Sale: A Data Giant Is Mapping, and Sharing, the Consumer Genome.”

Acxiom is the poster child for why tough new laws are needed ASAP to protect personal information on the Internet, in electronic systems, and on cell phones. No data should be collected about Americans without prior meaningful, informed consent.

Natasha Singer’s story is a must-read for understanding how the use of personal data threatens people’s jobs, reputations, and future opportunities. The information is analyzed and sold to those who want detailed, real-time profiles of who we are, including the health of our minds and bodies. Data analytics enable Acxiom to create and sell far more intimate, detailed personality and behavioral portraits than even our own mothers or analysts might know about us (and would never share).

Most people have never heard of Acxiom or other hidden data users. Today, most Americans have no idea that personal data is used by thousands of corporations and government agencies to make decisions about whether they will receive jobs or benefits.

Even though the hidden data mining industry began by using personal information to improve marketing and advertising, Acxiom proves that the kinds and amounts of identifiable data being collected are simply unacceptable. As for the collection of health information, the data mining industry is clearly violating Americans’ very strong legal, Constitutional, and ethical rights to control and keep personal health data private. To the public, this is theft of personal health information.

On June 6th at the 2nd International Summit on the Future of Health Privacy, Professor Latanya Sweeney of the Harvard Data Privacy Lab, along with Patient Privacy Rights, introduced theDataMap.org. This project will enable citizens and whistleblowers to help create a detailed map of where sensitive personal health information flows, from prescription records to DNA to diagnoses. Without a ‘chain of custody’ for our identifiable health data, it’s impossible to know who uses our data or why. A ‘chain of custody’ for personal health data could show us whether potential employers or banks had bought or received our health data, reveal the many ways the federal government uses health data as described in the Federal Health Information Technology Strategic Plans, and identify the for-profit and public research and public health institutions that use personal health data.
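To illustrate what a ‘chain of custody’ for health data could record, here is a minimal, hypothetical sketch of a per-disclosure log entry in Python. The field names and example organizations are assumptions made up for illustration; they are not part of theDataMap.org or any existing standard.

    # Hypothetical sketch of a per-disclosure "chain of custody" log entry.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class DisclosureRecord:
        patient_id: str     # pseudonymous identifier
        data_category: str  # e.g. "prescriptions", "DNA", "diagnoses"
        sender: str         # organization releasing the data
        recipient: str      # organization receiving the data
        purpose: str        # stated reason for the disclosure
        timestamp: str      # when the disclosure occurred (ISO 8601)

    # If every holder of health data appended an entry like this for each
    # disclosure, patients could reconstruct where copies of their records
    # went and why.
    log = [
        DisclosureRecord(
            patient_id="anon-1234",
            data_category="prescriptions",
            sender="Example Pharmacy",
            recipient="Example Analytics Co.",
            purpose="data analytics",
            timestamp=datetime.now(timezone.utc).isoformat(),
        )
    ]
    for rec in log:
        print(f"{rec.timestamp}: {rec.sender} -> {rec.recipient} ({rec.data_category}, purpose: {rec.purpose})")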

Health data has long been used to discriminate against people for jobs, insurance, and credit. This fact is so well known that every year tens of millions of us refuse to get early diagnoses and treatment for cancer, depression, and sexually transmitted diseases. Hidden data flow causes bad health outcomes; treatment delays can be deadly. We need the same kind of control/consent over the use of electronic health data that we have always had for paper medical records.

US Internet and electronic systems have made us the most intimately surveilled people in the Free World. In Europe, strong laws and privacy-enhancing technologies prevent hidden data collection and data flow, so everyone benefits from technology and harms are avoided.

European standards for the collection of personal data were created after WW II, when data were used to decide who would die. Europeans consequently passed the world’s toughest data privacy laws, preventing personal data from being collected or used without consent.

Europe also established regional Data Privacy Commissioners to defend citizens’ rights to control the collection and use of personal information and ensure data accuracy. The US needs them too.

Unless we know where trillions of bytes of our personal data flow, who uses them, and why, we cannot weigh the benefits and risks of using the Internet, electronic systems, or cell phones. It’s time for Congress to end the massive hidden flows of personal data.