Privacy Framework: A Practical Tool?

An interesting article about our Privacy Framework. To view the full article, please visit Privacy Framework: A Practical Tool?

Some key quotes:

“The PPR Trust Framework is … designed to help organizations ensure that technology and IT systems align with the privacy requirements of critical importance to patients and reflect their legal and ethical rights to health information privacy,” Peel says.

“The framework was developed by a group within Patient Privacy Rights – the bipartisan Coalition for Patient Privacy – along with Microsoft and the consulting firm PricewaterhouseCoopers, Peel says. It was developed, tested and validated on Microsoft’s HealthVault personal health record platform.”

“Ensuring the privacy of patient data is a key concern for any healthcare IT vendor,” says Sean Nolan, distinguished engineer, Microsoft HealthVault. “Microsoft as a company advocates for a more standardized federal approach to the privacy of data, and this is especially true for the HealthVault team. We believe that it takes a deep corporate commitment to the privacy of patient data in order to support initiatives such as the PPR Trust Framework.”

Mostashari, policy committee take critical look at CommonWell

To view the full article, please visit: Mostashari, policy committee take critical look at CommonWell

The ONLY way patients/the public will trust health technology systems is if THEY control 'interoperability', i.e., if THEY control their sensitive health data. Patients have strong rights to control exactly who can collect, use, and disclose their health data. This also happens to be what the public expects and wants MOST from HIT. The public has strong legal rights to control PHI, despite our flawed HIT systems.

The story below is about an attempt by large technology vendors and the government to maintain control over the nation’s sensitive health data. Institutional/government-sanctioned models like the CommonWell Alliance violate patients’ rights to control their medical records (from diagnoses to DNA to prescription records).  Patients should be able to:

  • choose personal email addresses as their IDs; there is no need for institutions to choose IDs for us, and email addresses work very well as IDs on the Internet
  • download and store their health information from electronic health records (EHR) systems, a right required by HIPAA since 2001 but only now becoming reality via the Blue Button+ project
  • email their doctors using Direct secure email

Today’s systems violate 2,400 years of ethics underlying the doctor-patient relationship and the practice of Medicine: Hippocrates’ discovery that patients would only be able to trust physicians with deeply personal information about their bodies and minds IF the doctors never shared that information without consent. That ethic, to guard the patient’s information and act as the patient’s agent and protector, is codified in the Hippocratic Oath and embodied in American law and the AMA Code of Medical Ethics. Americans have strong rights to health information privacy which HIPAA has not wiped out (HIPAA is the FLOOR, not the CEILING, for our privacy rights).

The public does NOT agree that their sensitive health data should be used without consent—they expect to control health information with rare legal exceptions. See: http://patientprivacyrights.or…. HUGE majorities believe that individuals alone should decide what data they want to share with whom—not one-size-fits-all law or policies.

Nor does the public agree to the use of their personal health data for “research”—whether for clinical research about diseases or for commercial use of the data by industry via the ‘research and public health loopholes’ in HIPAA. Only 1% of the public agrees to unfettered use of personal health data for research. Read more about these survey results here.

The entire healthcare system depends TOTALLY on a two-person relationship, and whether there is trust between those two people. We must face the fact that today’s HIT systems VIOLATE that personal relationship by making it ‘public’ via health technology systems designed for data mining and surveillance. Instead we need technology designed to ensure patient control over personal health information (with rare legal exceptions). When patients cannot trust their doctors, health professionals, or the flawed technology systems they use, the consequence is that many millions of patients avoid or delay treatment and hide information. Every year many millions of Americans take actions which CAUSE BAD OUTCOMES.

Current health technologies and data exchange systems cause millions of people annually to risk their health and lives, i.e., the technologies we are using now cause BAD OUTCOMES.

We have to face facts and design systems that can be trusted. Patient Privacy Rights’ Trust Framework details in 75 auditable criteria what it takes to be a trusted technology or system. See: http://patientprivacyrights.or… or download the paper at:
http://ssrn.com/abstract=22316…

Sensitive data still pose special challenges

At a recent meeting of the National Health IT Policy Committee, the CEO of a large electronic health records (EHR) corporation said technology for “data segmentation”—which ensures patients control who sees and uses sensitive data—is something “vendors don’t know how to do.” But that simply isn’t true. Vendors do know how to build that kind of technology; in fact, it already exists.

At the same meeting, the National Coordinator for Health IT recognized the Department of Veterans Affairs and the Substance Abuse and Mental Health Services Administration for their “demonstration of technology developed for data segmentation and tagging for patient consent management,” but he seemed to forget that millions of people receiving mental health and addiction treatment have been using EHRs with consent and data segmentation technologies for over 12 years. Again, the technology already exists.

Facts:

  • Technology is NOT the problem: it is not too hard or too expensive to build or use consent and data segmentation technologies.
  • Data segmentation and consent technologies exist: the oldest example is EHRs used for millions of mental health and addiction treatment records for the past 12 years.
  • All EHRs must be able to “segment” erroneous data to keep it from being disclosed and harming patients; that same technology can be used to “segment” sensitive health data.
  • Data segmentation and consent technologies were demonstrated ‘live’ at the Consumer Choices Technology Hearing in 2010. See a video: http://nmr.rampard.com/hit/20100629/default.html
  • Starting in 2001, HIPAA required data segmentation and consent technology for EHRs to keep “psychotherapy notes” separated from other health data. “Psychotherapy notes” can ONLY be disclosed with patient permission.
  • The 2013 amendments to HIPAA require EHRs to handle other situations where data must be segmented and consent is required. For example:
      • If you pay out-of-pocket for treatment or for a prescription in order to keep your sensitive information private, technology systems must prevent your data from being disclosed to other parties.
      • After the first time you are contacted by hospital fundraisers who saw your health data, you can opt out and block the fundraisers from future access to your EHR.
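At its core, the “data segmentation” that vendors claim not to know how to build is tag-and-filter logic: record entries carry sensitivity tags, and a patient’s consent directives determine which entries each recipient may see. The sketch below illustrates the idea only; the category names, classes, and consent model are hypothetical and are not drawn from any real EHR product or standard.

```python
from dataclasses import dataclass

# Hypothetical sensitivity tags for illustration; real systems use
# standardized code sets rather than these ad-hoc names.
SENSITIVE_CATEGORIES = {"mental_health", "substance_abuse", "genetic"}

@dataclass
class RecordEntry:
    description: str
    category: str  # e.g. "general", "mental_health", ...

@dataclass
class ConsentDirective:
    # Maps each recipient to the sensitive categories the patient
    # has explicitly agreed to share with that recipient.
    allowed: dict

def segment_for_recipient(entries, consent, recipient):
    """Return only the entries the patient has consented to share with
    this recipient; sensitive categories are withheld by default."""
    allowed = consent.allowed.get(recipient, set())
    released = []
    for entry in entries:
        if entry.category in SENSITIVE_CATEGORIES and entry.category not in allowed:
            continue  # segment (withhold) undisclosed sensitive data
        released.append(entry)
    return released
```

For example, a patient who consents to share mental health records with her psychiatrist but with no one else would have `allowed={"psychiatrist": {"mental_health"}}`; a hospital fundraiser querying the same record would receive only the non-sensitive entries.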

The real problem is that current technology systems and data exchanges are not built to work the way the public expects them to—they violate Americans’ ethical and legal rights to health information privacy.

The public will discover that today’s health technologies and systems have fatal privacy flaws. The unintended consequence of using flawed technology is that millions of people will avoid or delay treatment and hide information to keep their health information private, and will suffer bad health outcomes as a result.

US health technology should improve health and outcomes, not cause the health of millions to worsen.

How can the US fix the privacy flaws in health technology systems so EHRs and other health technologies can be trusted?

Framework Outlines Key Principles for Protecting Privacy of Patient Data

To view the full article, please visit Framework Outlines Key Principles for Protecting Privacy of Patient Data.

iHealthBeat released an article about the Privacy Rights framework explaining its goals and principles.

Key quote from the article:

“The framework aims to help health care organizations measure how well their IT systems and research projects meet certain best practices for protecting patient privacy.

Patient Privacy Rights eventually intends to develop a system to license organizations based on their privacy policies and practices.”

The full Privacy Trust Framework can be viewed here.

New Framework Details 15 Core Health Privacy Principles

To view the full article, please visit New Framework Details 15 Core Health Privacy Principles.

HealthDataManagement.com recently posted this article about Patient Privacy Rights’ Privacy Trust Framework. The article tells HealthDataManagement readers “The Framework is designed to help measure and test whether health information systems and research projects comply with best privacy practices in such areas as whether patients have control over their protected health information, an organization obtains meaningful consent before disclosing data and obtains new consent before secondary data use occurs, patients have the ability to selectively share data, and the organization uses servers housed in the United States, among other factors.”

The key principles for our Privacy Trust Framework:

* Patients can easily find, review and understand the privacy policy.

* The privacy policy fully discloses how personal health information will and will not be used by the organization. Patients’ information is never shared or sold without patients’ explicit permission.

* Patients decide if they want to participate.

* Patients are clearly warned before any outside organization that does not fully comply with the privacy policy can access their information.

* Patients decide and actively indicate if they want to be profiled, tracked or targeted.

* Patients decide how and if their sensitive information is shared.

* Patients are able to change any information that they input themselves.

* Patients decide who can access their information.

* Patients with disabilities are able to manage their information while maintaining privacy.

* Patients can easily find out who has accessed or used their information.

* Patients are notified promptly if their information is lost, stolen or improperly accessed.

* Patients can easily report concerns and get answers.

* Patients can expect the organization to punish any employee or contractor that misuses patient information.

* Patients can expect their data to be secure.

* Patients can expect to receive a copy of all disclosures of their information.

The full framework can be viewed at Privacy Rights Framework.

The Immortal Life of Henrietta Lacks, the Sequel

This is an amazing article by Rebecca Skloot, author of ‘The Immortal Life of Henrietta Lacks’, demanding consent and trust.

Rebecca is right: the only way Americans will trust researchers is when they are treated with respect and their rights of consent for use of genomes and genetic information are restored.

The public does not yet realize that they have no control over their sensitive health information in electronic systems, a situation that violates hundreds of years of privacy rights. We have NO idea how many hundreds of data mining and research corporations are collecting and using our blood and body parts.

This week, the many stories about CVS showed that employers can force employees to take blood tests and health screenings and can push them into “wellness” programs, all of which REQUIRE collection of sensitive health information that employees cannot control.

We have NO map of who collects and uses personal health data; Henrietta Lacks’ family was NEVER asked for consent to use her genome.

Contribute to build a map to track the thousands of hidden users of health data.

Attend or watch the 3rd International summit on the Future of Health Privacy (free). Register at: www.healthprivacysummit.org

HIStalk News 3/22/13 – Quotes Dr. Deborah Peel on new CVS policy

To view the full article, please visit HIStalk News 3/22/13.

Key quote from the article:

“Patient Privacy Rights Founder Deborah Peel, MD calls a new CVS employee policy that charges employees who decline obesity checks $50 per month “incredibly coercive and invasive.” CVS covers the cost of an assessment of height, weight, body fat, blood pressure, and serum glucose and lipid levels, but also reserves the right to send the results to a health management firm even though CVS management won’t have access to the results directly. Peel says a lack of chain of custody requirements means that CVS could review the information and use it to make personnel decisions.”

CVS requiring employees to undergo weight, health assessment

To view the full article, please visit CVS requiring employees to undergo weight, health assessment.

Key quotes from the article:

“This is an incredibly coercive and invasive thing to ask employees to do,” Patient Privacy Rights founder Deborah Peel told the Boston Herald, noting that such policies are becoming more prevalent as health costs increase.

“Rising health care costs are killing the economy, and businesses are terrified,” she continued to the Herald. “Now, we’re all in this terrible situation where employers are desperate to get rid of workers who have costly health conditions, like obesity and diabetes.”

“While patient-privacy activists have cried foul, Michael DeAngelis, a CVS spokesman, explained that the goal is health.”

To learn more about the issue, please visit our Health Privacy Summit Website and register for the 3rd International Summit on the Future of Health Privacy.

CVS imposes health penalty if workers’ body weight is not reported or they don’t quit smoking

To view the full article, please visit CVS imposes health penalty if workers’ body weight is not reported or they don’t quit smoking.

CVS has instituted a very invasive new policy of charging workers a hefty $600-a-year fine if they do not disclose sensitive health information to the company’s benefits firm. According to the article, “Under the new policy, nearly 200,000 CVS employees who obtain health insurance through the company will have to report their weight, blood sugar, blood pressure and cholesterol to WebMD Health Services Group, which provides benefits support to CVS.” If employees refuse, they will be charged an extra $50 a month in health insurance costs.

Patient Privacy Rights’ Dr. Deborah Peel tells the public: “This is an incredibly coercive and invasive thing to ask employees to do. … Rising healthcare costs are killing the economy, and businesses are terrified. Now, we’re all in this terrible situation where employers are desperate to get rid of workers who have costly health conditions, like obesity and diabetes.”

To learn more about this issue, please visit our Health Privacy Summit Website and register for the 3rd International Summit on the Future of Health Privacy.

Re: Your Online Attention, Bought in an Instant

Natasha Singer unearths more about the instantaneous selling of intimately detailed profiles about Americans in her article in The New York Times: Your Online Attention, Bought in an Instant

Best case: We get more ‘targeted’ ads. We supposedly want personalized ads so badly that we willingly give up deeply intimate portraits of who we are to the hidden data mining industry, forever. Really? When did we ever have ANY meaningful choice about who collects and sells our most intimate personal information? See Duhigg’s NYTimes story.

Worst case: Hidden, technology-enabled discrimination prevents us from getting jobs and destroys our reputations before anyone will meet with us. Companies like Rubicon literally know more about us than our partners, our mothers or fathers, our best friends, our children or our psychoanalysts. This information is used to harm us. Read Prof. Sweeney’s paper on how ads like “YOUR NAME, arrested?” pop up next to the names of African-Americans but NOT next to Anglo-sounding names. What happens when future employers see ads like that when searching for information about you online? Read her paper here.

HELP FIX THIS PRIVACY DISASTER
HELP BUILD a map that tracks all hidden users and sellers of our sensitive health information.
DONATE to the Harvard/Patient Privacy Rights’ research project at: https://org2.democracyinaction.org/o/6402/donate_page/donate-to-thedatamap

European citizens have far stronger protections for their sensitive health and personal data than US citizens.
Learn why and learn about solutions to strengthen US data protections. Register for free to attend the 3rd International Summit on the Future of Health Privacy June 5-6 in DC: www.healthprivacysummit.org