UofL professor wins health information privacy award

Patient Privacy Rights, a leading health privacy advocacy organization, will award one of its two annual Louis D. Brandeis Privacy Awards to University of Louisville professor Mark A. Rothstein on June 5 in conjunction with the Third International Summit on the Future of Health Privacy at the Georgetown University Law Center in Washington.

Established in 2012, the award is given with the approval of the Brandeis family and recognizes significant intellectual, cultural, legal, scholarly, and technical contributions to the field of health information privacy.

Rothstein holds the Herbert F. Boehl Chair of Law and Medicine at the UofL School of Medicine, and he also teaches at UofL’s Brandeis School of Law. The award’s ties to Brandeis make it especially meaningful to him, he said.

Leader of Hospital Identity Theft Ring Sentenced

It’s impossible to stop the tsunami of fraud, ID theft, and medical ID theft until we rebuild US health IT systems to prevent open access to millions of patient records by thousands of hospital and insurance company employees.
Systems should be rebuilt to allow ONLY the few people who are directly involved in a patient’s treatment to access that patient’s health records.

  • -ONLY those who carry out the orders of the patient’s physician should be able to access that patient’s electronic health records.
  • -The other hundreds or thousands of hospital system employees and staff members should not be physically or technically able to access that patient’s records.
  • -When a patient is admitted, one physician is in charge of diagnosis and treatment.
  • -All people the attending physician orders to treat the patient (nurses, consultants, respiratory therapists, etc.) work for that physician, the “captain of the ship.”

Health data cannot possibly be protected when thousands of people have access to millions of patient records.  Employees of the hundreds of separate health technologies used by every hospital also have open access to millions of patient records.
The more people who have access to sensitive personal health data, the easier it is to steal, sell, or misuse it.
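The “captain of the ship” model described above amounts to a simple allow-list check: only clinicians the attending physician has ordered onto a patient’s care team may read that chart. Here is a minimal sketch of that idea, not any vendor’s actual implementation; all names and classes are hypothetical:

```python
# Minimal sketch of "captain of the ship" access control: only clinicians
# whom the attending physician has ordered onto a patient's care team may
# read that patient's chart. All names and classes here are hypothetical.

class PatientRecord:
    def __init__(self, patient_id, attending):
        self.patient_id = patient_id
        self.attending = attending
        self.care_team = {attending}  # the attending is always on the team

    def add_to_care_team(self, requester, clinician):
        # Only the attending physician may delegate access.
        if requester != self.attending:
            raise PermissionError(f"{requester} cannot delegate access")
        self.care_team.add(clinician)

    def read(self, requester):
        # Everyone not on this patient's treatment team is denied.
        if requester not in self.care_team:
            raise PermissionError(f"{requester} is not on the care team")
        return f"chart for {self.patient_id}"

record = PatientRecord("pt-001", attending="dr_smith")
record.add_to_care_team("dr_smith", "nurse_jones")
print(record.read("nurse_jones"))  # on the care team: allowed
try:
    record.read("billing_clerk")   # not treating the patient: denied
except PermissionError as err:
    print("denied:", err)
```

The point of the sketch is that the default is denial: the billing clerk and the thousands of other employees are simply not on the list, so the system physically cannot show them the record.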

The Right to Obtain Restrictions Under the HIPAA/HITECH Rule: A Return to the Ethical Practice of Medicine

To view the full article, please visit: The Right to Obtain Restrictions Under the HIPAA/HITECH Rule: A Return to the Ethical Practice of Medicine.

A great explanation of how industry has fought to influence those in government who write the ‘rules’ for how federal law works in practice. The key industry tactic is to complain that complying with the law is too costly, impossible, or would take too much time. For reasons we don’t understand, the government agency that writes the ‘rules’ takes the side of industry rather than defending patients.

athenahealth and Mashery team up for health developer-friendly API initiative

To view the full article, please visit athenahealth and Mashery team up for health developer-friendly API initiative.

Electronic health record (EHR) companies allow access to patients’ sensitive health data and to sensitive information about physicians’ practices so technology companies can develop applications.

Applications have the potential to be useful to physicians and patients, but at what cost to privacy? Will EHR “apps” secretly collect and sell people’s information the way smartphone apps collect and sell contacts, GPS data, and more? We now know the business model for many technologies is selling intimate personal data.

Quotes:

  • -athenahealth will open “access to doctors’ appointment data, patient’s medical history (anonymized), billing information and more”
  • -“the company hopes developers will be able to create an ecosystem of apps on top of athenahealth’s EMR service”
  • -“Other EMR providers, including Allscripts and Greenway, have also opened up their APIs to developers and created app marketplaces.”

The press release on this athenahealth project stated, “We’re providing the data and knowledge from our cloud-based network, a captive audience for developers to innovate for, and an online sandbox to do it all in.”

  • -Who are the “captives”? athenahealth’s 40,000 physicians and their hundreds of thousands of patients.

QUESTIONS:

  • -When were the “captive” patients asked to consent to strangers using and monetizing their health records?
  • -When were the “captive” physicians asked to consent to strangers using information about their practices: what they charge, who they treat, how they treat patients, how and by whom they are paid, and much more?
  • -Why does athenahealth claim that patient data is “anonymized” when it’s impossible to prevent “anonymized” patient records from being easily re-identified?

Many electronic health record (EHR) companies allow access to, or sell, sensitive patient data to technology developers and other companies.

BROADER QUESTIONS

  • -When did the public learn about, debate, or agree to the use of their sensitive patient data by technology companies to build products?
  • -Why do technology companies claim that “anonymization” and “de-identification” of health data work, when computer science has clearly proved them wrong?
  • -How is the identifiable health data of hundreds of thousands of patients protected from any OTHER uses the technology developers decide to put it to?
  • -How can the public weigh the risks and harms vs. benefits of using EHRs when there is no ‘chain of custody’ for our health data and no data map that tracks the thousands of HIDDEN users of our personal health information?
  • -See Harvard Prof. Latanya Sweeney explain the need for a data map at: http://tiny.cc/5pjqvw
  • -Attend or watch via live-streamed video the 2013 International Summit on the Future of Health Privacy in Washington, DC, June 5-6 to see the first data map Prof. Sweeney’s team has built. Registration to attend or watch is free at: www.healthprivacytsummit.org
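The re-identification problem is easy to demonstrate: records stripped of names usually still contain “quasi-identifiers” (ZIP code, birth date, sex) that can be matched against a public roster such as a voter registration list, the approach Prof. Sweeney’s research made famous. A toy illustration using entirely invented data:

```python
# Toy re-identification demo: "anonymized" records (names removed) are
# re-linked to identities by matching quasi-identifiers against a public
# roster. All data here is invented for illustration.

anonymized_records = [
    {"zip": "40202", "birth_date": "1960-03-14", "sex": "F", "diagnosis": "depression"},
    {"zip": "40202", "birth_date": "1975-07-02", "sex": "M", "diagnosis": "HIV"},
]

# A public record, e.g. a voter registration list.
public_roster = [
    {"name": "Jane Doe", "zip": "40202", "birth_date": "1960-03-14", "sex": "F"},
    {"name": "John Roe", "zip": "40202", "birth_date": "1975-07-02", "sex": "M"},
]

def reidentify(record, roster):
    # Match on the quasi-identifiers the "anonymized" record still carries.
    keys = ("zip", "birth_date", "sex")
    matches = [p for p in roster if all(p[k] == record[k] for k in keys)]
    # A unique match re-identifies the "anonymous" patient.
    return matches[0]["name"] if len(matches) == 1 else None

for rec in anonymized_records:
    print(reidentify(rec, public_roster), "->", rec["diagnosis"])
```

No names were ever in the “anonymized” file, yet each diagnosis links back to a person, because the combination of ZIP, birth date, and sex is unique for most Americans.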

Privacy Framework: A Practical Tool?

An interesting article about our Privacy Framework. To view the full article, please visit Privacy Framework: A Practical Tool?

Some key quotes:

“The PPR Trust Framework is … designed to help organizations ensure that technology and IT systems align with the privacy requirements of critical importance to patients and reflect their legal and ethical rights to health information privacy,” Peel says.

“The framework was developed by a group within Patient Privacy Rights – the bipartisan Coalition for Patient Privacy – along with Microsoft and the consulting firm PricewaterhouseCoopers, Peel says. It was developed, tested and validated on Microsoft’s HealthVault personal health record platform.”

“Ensuring the privacy of patient data is a key concern for any healthcare IT vendor,” says Sean Nolan, distinguished engineer, Microsoft HealthVault. “Microsoft as a company advocates for a more standardized federal approach to the privacy of data, and this is especially true for the HealthVault team. We believe that it takes a deep corporate commitment to the privacy of patient data in order to support initiatives such as the PPR Trust Framework.”

Mostashari, policy committee take critical look at CommonWell

To view the full article, please visit: Mostashari, policy committee take critical look at CommonWell

The ONLY way patients/the public will trust health technology systems is if THEY control ‘interoperability’, i.e., if THEY control their sensitive health data. Patients have strong rights to control exactly who can collect, use, and disclose their health data. This also happens to be what the public expects and wants MOST from health IT. The public has strong legal rights to control PHI, despite our flawed HIT systems.

The story below is about an attempt by large technology vendors and the government to maintain control over the nation’s sensitive health data. Institutional/government-sanctioned models like the CommonWell Alliance violate patients’ rights to control their medical records (from diagnoses to DNA to prescription records).  Patients should be able to:

  • -choose personal email addresses as their IDs; there is no need for institutions to choose IDs for us, since email addresses work very well as IDs on the Internet
  • -download and store their health information from electronic health records systems (EHRs), a right required by HIPAA since 2001 but only now becoming reality via the Blue Button+ project
  • -email their doctors using Direct secure email

Today’s systems violate 2,400 years of ethics underlying the doctor-patient relationship and the practice of medicine: Hippocrates’ discovery that patients would only trust physicians with deeply personal information about their bodies and minds IF the doctors never shared that information without consent. That ethic, to guard the patient’s information and act as the patient’s agent and protector, is codified in the Hippocratic Oath and embodied in American law and the AMA Code of Medical Ethics. Americans have strong rights to health information privacy, which HIPAA has not wiped out (HIPAA is the FLOOR, not the CEILING, for our privacy rights).

The public does NOT agree that their sensitive health data should be used without consent; they expect to control health information, with rare legal exceptions. See: http://patientprivacyrights.or…. HUGE majorities believe that individuals alone should decide what data they want to share with whom, not one-size-fits-all laws or policies.

Nor does the public agree to use of their personal health data for “research”—whether for clinical research about diseases or by industry for commercial use of the data via the ‘research and public health loopholes’ in HIPAA. Only 1% of the public agrees to unfettered use of personal health data for research. Read more about these survey results here.

The entire healthcare system depends TOTALLY on a two-person relationship and on whether there is trust between those two people. We must face the fact that today’s HIT systems VIOLATE that personal relationship by making it ‘public’ via health technology systems designed for data mining and surveillance. Instead we need technology designed to ensure patient control over personal health information (with rare legal exceptions). When patients cannot trust their doctors, health professionals, or the flawed technology systems they use, the consequence is that many millions of patients avoid or delay treatment and hide information. Every year many millions of Americans take actions that CAUSE BAD OUTCOMES.

Current health technologies and data exchange systems cause millions of people annually to risk their health and lives; i.e., the technologies we are using now cause BAD OUTCOMES.

We have to face facts and design systems that can be trusted. Patient Privacy Rights’ Trust Framework details, in 75 auditable criteria, what it takes to be a trusted technology or system. See: http://patientprivacyrights.or… or download the paper at:
http://ssrn.com/abstract=22316…

Sensitive data still pose special challenges

At a recent meeting of the National Health IT Policy Committee, the CEO of a large electronic health records (EHR) corporation said technology for “data segmentation” (which ensures patients control who sees and uses sensitive data) is something “vendors don’t know how to do.” But that simply isn’t true. Vendors do know how to build that kind of technology; in fact, it already exists.

At the same meeting, the National Coordinator for Health IT recognized the Department of Veterans Affairs and the Substance Abuse and Mental Health Services Administration for their “demonstration of technology developed for data segmentation and tagging for patient consent management,” but he seemed to forget that millions of people receiving mental health and addiction treatment have been using EHRs with consent and data segmentation technologies for over 12 years. Again, the technology already exists.

Facts:

  • -Technology is NOT the problem: it is not too hard or too expensive to build or use consent and data segmentation technologies.
  • -Data segmentation and consent technologies exist: the oldest example is the EHRs used for millions of mental health and addiction treatment records over the past 12 years.
  • -All EHRs must be able to “segment” erroneous data to keep it from being disclosed and harming patients; that same technology can be used to “segment” sensitive health data.
  • -Data segmentation and consent technologies were demonstrated ‘live’ at the Consumer Choices Technology Hearing in 2010. See a video: http://nmr.rampard.com/hit/20100629/default.html
  • -Starting in 2001, HIPAA required data segmentation and consent technology for EHRs to keep “psychotherapy notes” separate from other health data. “Psychotherapy notes” can ONLY be disclosed with patient permission.
  • -The 2013 amendments to HIPAA require EHRs to handle other situations where data must be segmented and consent is required. For example:
    • -If you pay out-of-pocket for a treatment or prescription in order to keep your sensitive information private, technology systems must prevent your data from being disclosed to other parties.
    • -After the first time you are contacted by hospital fundraisers who saw your health data, you can opt out and block the fundraisers from future access to your EHR.
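At its core, the data segmentation described above is a simple idea: tag each record entry with a sensitivity category, and withhold tagged entries unless the patient has consented to release that category to the requesting party. Here is a minimal sketch of that logic (the tags, names, and data structures are hypothetical, not any vendor’s design):

```python
# Minimal sketch of data segmentation with patient consent: each EHR entry
# carries an optional sensitivity tag, and tagged entries are withheld
# unless the patient has consented to release that category to the
# requesting recipient. All tags, names, and data here are hypothetical.

entries = [
    {"data": "blood pressure 120/80", "tag": None},               # routine data
    {"data": "psychotherapy note",    "tag": "psychotherapy"},    # HIPAA-protected
    {"data": "buprenorphine refill",  "tag": "substance_abuse"},  # 42 CFR Part 2-type data
]

# The patient's consent directives: which tagged categories each
# recipient is permitted to see. Anyone not listed gets none of them.
consents = {"dr_smith": {"psychotherapy"}}

def disclose(entries, recipient, consents):
    # Untagged entries flow normally; tagged entries require consent.
    allowed = consents.get(recipient, set())
    return [e["data"] for e in entries
            if e["tag"] is None or e["tag"] in allowed]

print(disclose(entries, "dr_smith", consents))  # routine data + psychotherapy note
print(disclose(entries, "insurer", consents))   # routine data only
```

The same filter that keeps erroneous data out of a disclosure can carry these sensitivity tags, which is why the “vendors don’t know how to do it” claim does not hold up.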

The real problem is that current technology systems and data exchanges are not built to work the way the public expects them to: they violate Americans’ ethical and legal rights to health information privacy.

The public will discover that today’s health technologies and systems have fatal privacy flaws. The unintended consequence of using flawed technology is that millions of people will avoid or delay treatment, hide information to keep their health information private, and suffer bad health outcomes.

US health technology should improve health and outcomes, not cause the health of millions to worsen.

How can the US fix the privacy flaws in health technology systems so EHRs and other health technologies can be trusted?

Groups develop privacy framework for health IT

To view the full article, please visit Groups develop privacy framework for health IT.

An article at ModernHealthcare.com about our new Privacy Trust Framework explains how the framework came into being and what its major principles are.

Key quote from the article:

“‘This comes from what the American public wants and was devised by Microsoft and PricewaterhouseCoopers,’ Peel said. ‘Some of the bigger corporations see the future as the public controlling things. Microsoft wanted to distinguish itself from Google Health (its one-time rival as a developer of PHR platforms) and wanted HealthVault to be the privacy place and wanted to compete in that way.’ PricewaterhouseCoopers saw a future auditing opportunity, she said. ‘We’re now moving with the Blue Button where patients can access their information and control it. The ultimate consumer is the patient.'”

The Privacy Trust Framework can be found here.

Framework Outlines Key Principles for Protecting Privacy of Patient Data

To view the full article, please visit Framework Outlines Key Principles for Protecting Privacy of Patient Data.

iHealthBeat released an article about the Privacy Trust Framework, explaining its goals and principles.

Key quote from the article:

“The framework aims to help health care organizations measure how well their IT systems and research projects meet certain best practices for protecting patient privacy.

Patient Privacy Rights eventually intends to develop a system to license organizations based on their privacy policies and practices.”

The full Privacy Trust Framework can be viewed here.

New Framework Details 15 Core Health Privacy Principles

To view the full article, please visit New Framework Details 15 Core Health Privacy Principles.

HealthDataManagement.com recently posted this article about Patient Privacy Rights’ Privacy Trust Framework. The article tells HealthDataManagement readers “The Framework is designed to help measure and test whether health information systems and research projects comply with best privacy practices in such areas as whether patients have control over their protected health information, an organization obtains meaningful consent before disclosing data and obtains new consent before secondary data use occurs, patients have the ability to selectively share data, and the organization uses servers housed in the United States, among other factors.”

The key principles for our Privacy Trust Framework:

* Patients can easily find, review and understand the privacy policy.

* The privacy policy fully discloses how personal health information will and will not be used by the organization. Patients’ information is never shared or sold without patients’ explicit permission.

* Patients decide if they want to participate.

* Patients are clearly warned before any outside organization that does not fully comply with the privacy policy can access their information.

* Patients decide and actively indicate if they want to be profiled, tracked or targeted.

* Patients decide how and if their sensitive information is shared.

* Patients are able to change any information that they input themselves.

* Patients decide who can access their information.

* Patients with disabilities are able to manage their information while maintaining privacy.

* Patients can easily find out who has accessed or used their information.

* Patients are notified promptly if their information is lost, stolen or improperly accessed.

* Patients can easily report concerns and get answers.

* Patients can expect the organization to punish any employee or contractor that misuses patient information.

* Patients can expect their data to be secure.

* Patients can expect to receive a copy of all disclosures of their information.

The full framework can be viewed at Privacy Rights Framework.