What You Need to Know About Patient Matching and Your Privacy, and What You Can Do About It

Today, ONC released a report on patient matching practices, and to the casual reader it will look like a byzantine subject. It’s not.

You should care about patient matching, and you will.

It impacts your ability to coordinate care, purchase life and disability insurance, and maybe even your job. Through ID theft, it also impacts your safety and security. Patient matching’s most significant impact, however, could be to your pocketbook: it is being used to fix prices and reduce competition in a high-deductible insurance system that exposes families to up to $12,700 of out-of-pocket expenses every year.

Patient matching is the healthcare cousin of NSA surveillance.

Health IT’s watershed will come when people finally realize that hospital privacy and security practices are unfair, and we begin to demand consent, data minimization, and transparency for our most intimate information. The practices suggested by Patient Privacy Rights are relatively simple and obvious, and they are discussed toward the end of this article.

Health IT tries to be different from other IT sectors. There are many reasons for this; few of them are good. Health IT practices are dictated by HIPAA, whereas the rest of IT is governed by the FTC or the Fair Credit Reporting Act. Healthcare is mostly paid for by third-party insurance, so the risks of fraud are different from those in traditional markets.

Healthcare is delivered by strictly licensed professionals regulated differently from the institutions that purchase the health IT. These are the major reasons for healthcare IT exceptionalism, but they are not a good excuse for bad privacy and security practices, so this is about to change.

Health IT privacy and security are in tatters, and nowhere is this more evident than in the “patient matching” discussion. Although HIPAA has some significant security features, it also eliminated a patient’s right to consent and to Fair Information Practice.

Patient matching by all sorts of health information aggregators and health information exchanges is involuntary and hidden from the patient as much as NSA surveillance is.

Patients have no idea how many databases are tracking our every healthcare action. We have no equivalent of the Fair Credit Reporting Act to cover these database operators. The databases are both public and private. The public ones are called Health Information Exchanges, All-Payer Claims Databases, Prescription Drug Monitoring Programs, mental health registries, Medicaid, and more.

The private ones are called “analytics” and sell billions of dollars’ worth of our aggregated data to hospitals eager to improve their margins, if not their mission.

The ONC report overlooks the obvious issue of FAIRNESS to the patient. The core elements of Fair Information Practice are Consent, Minimization, and Transparency. The report ignores all three:

- Consent is not sought. By definition, patient matching is required for information sharing, so patient matching without patient consent leads to sharing of PHI without patient consent. The consent form used to authorize patient matching must list the actual parameters that will be used for the match. Today’s generic Notice of Privacy Practices is as inadequate as signing a blank check.

- Data is not minimized. Citizen matching outside of the health sector is usually based on a unique and well-understood identifier such as a phone number, email, or SSN. To the extent that the practices in the report do not allow patients to specify their own matching criterion, a lot of extra private data is shared for patient matching purposes. This violates data minimization.

- Transparency is absent. The patient is not notified when they are matched. This violates the most basic principles of error management and security. In banking or online services, it is routine to get a simple email or a call when a security-sensitive transaction is made.

This must be required of all patient matching in healthcare. In addition, patients are not given access to the matching database. Credit bureaus that match citizens are required by the Fair Credit Reporting Act to provide this elementary degree of transparency; healthcare should be at least as strict.
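What might such a notification look like in practice? Here is a minimal sketch in Python, assuming a hypothetical exchange that emails the patient whenever a match is made. Every name in it (the addresses, the organization, the mail relay) is invented for illustration; a real system might notify via Direct secure messaging or text instead.

```python
# A minimal sketch of match notification, not any actual HIE's system.
# The addresses, organization name, and mail relay are all hypothetical.
import smtplib
from email.message import EmailMessage

def notify_patient_of_match(patient_email: str, requesting_org: str,
                            matched_fields: list[str]) -> None:
    """Send the patient a plain-language notice that their record was matched."""
    msg = EmailMessage()
    msg["From"] = "notices@example-hie.org"   # hypothetical exchange address
    msg["To"] = patient_email
    msg["Subject"] = "Your health record was matched"
    msg.set_content(
        f"{requesting_org} linked a record to you using: "
        f"{', '.join(matched_fields)}.\n"
        "If you do not recognize this activity, contact us to dispute the match."
    )
    with smtplib.SMTP("localhost") as smtp:   # assumes a local mail relay
        smtp.send_message(msg)

# Example: the exchange matched on name and date of birth and says so,
# which also satisfies the consent-form requirement to list match parameters.
notify_patient_of_match("pat@example.com", "General Hospital HIE",
                        ["full name", "date of birth"])
```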

These elementary features of any EHR and any exchange are the watershed defining patient-centered health IT. If a sense of privacy and trust doesn’t push our service providers to treat patients as first-class users, then the global need for improved cybersecurity will have to drive the shift. Healthcare is critical infrastructure just as much as food and energy.

But what can you, as a patient, do to hasten your emancipation? I would start with this simple checklist:

Opt out of sharing your health records unless the system offers:

  • Direct secure messaging with patients
  • Plain email or text notification of records matching
  • Patient-specified Direct email as match criterion (see the sketch after this list)
  • Your specific matching identifiers displayed on all consent forms
  • Online patient access to matchers and other aggregator databases
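To illustrate the third item, here is a minimal sketch, assuming a hypothetical record index keyed by a patient-chosen Direct address. It shows how little data a deterministic match needs compared with probabilistic matching on name, birth date, and address; the index and identifiers below are invented for illustration.

```python
# A sketch of matching on one patient-specified identifier (a Direct address)
# instead of broad demographics. The index and addresses are hypothetical.

# Hypothetical index mapping each patient-chosen Direct address to a record ID.
direct_address_index = {
    "jane.doe@direct.example-hie.org": "record-1842",
}

def match_by_direct_address(direct_address: str) -> str | None:
    """Deterministic match on a single patient-supplied identifier.

    No name, birth date, street address, or SSN is disclosed; the only datum
    shared is the identifier the patient chose for exactly this purpose.
    """
    return direct_address_index.get(direct_address)

print(match_by_direct_address("jane.doe@direct.example-hie.org"))  # record-1842
print(match_by_direct_address("unknown@direct.example.org"))       # None, no guessing
```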

None of these five requirements is too hard. Google, Apple, and your bank have done all of these things for years. The time has come for healthcare to follow suit.

Adrian Gropper, MD, is Chief Technical Officer of Patient Privacy Rights and participates in Blue Button+, Direct secure messaging governance efforts, and the evolution of patient-directed health information exchange.

Check out the latest from Dr. Gropper, courtesy of The Healthcare Blog.

Did Tim Armstrong’s ‘Distressed Babies’ Comment Violate HIPAA Privacy Laws?

US citizens have a fundamental Constitutional right to health information privacy, but they can’t easily sue. Only federal employees can sue under the Privacy Act of 1974, as veterans did when a laptop with millions of health records was stolen. Even with strong state health privacy laws and state constitutional rights to privacy in place, it’s very hard to sue because most courts demand proof of monetary harm. This new kind of digital disaster, the exposure and sale of sensitive personal health data, can’t be stopped without stronger, clearer federal laws, or unless US citizens boycott the corporations that violate their rights to health privacy.

-Deb

This blog was written in response to the following article:

Did Tim Armstrong’s ‘Distressed Babies’ Comment Violate HIPAA Privacy Laws?
By Abby Ohlheiser
The Wire, February 10, 2014

Revelations by AOL Boss Raise Fears Over Privacy

By Natasha Singer
NYTimes.com, February 10, 2014

Tim Armstrong, the chief executive of AOL, apologized last weekend for publicly revealing sensitive health care details about two employees to explain why the online media giant had decided to cut benefits. He even reinstated the benefits after a backlash.

But patient and work force experts say the gaffe could have a lasting impact on how comfortable — or discomfited — Americans feel about bosses’ data-mining their personal lives.

Mr. Armstrong made a seemingly offhand reference to “two AOL-ers that had distressed babies that were born that we paid a million dollars each to make sure those babies were O.K.” The comments, made in a conference call with employees, brought an immediate outcry, raising questions over corporate access to and handling of employees’ personal medical data.

“This example shows how easy it is for employers to find out if employees have a rare medical condition,” said Dr. Deborah C. Peel, founder of Patient Privacy Rights, a nonprofit group in Austin, Tex. She urged regulators to investigate Mr. Armstrong’s disclosure about the babies, saying “he completely outed these two families.”

To view the full article, please visit Revelations by AOL Boss Raise Fears Over Privacy

Guest Blog – The AOL Babies: Our Healthcare Crisis in a Nut

Check out the latest from Nic Terry, courtesy of HealthLawProf Blog.

Where does one start with AOL CEO Armstrong’s ridiculous and unfeeling justifications for changes in his company’s 401(k) plan? Cable TV and Twitter came out of the blocks fast with the obvious critiques. And the outrage only increased after novelist Deanna Fei took to Slate to identify her daughter as one of the subjects of Armstrong’s implied criticism. Armstrong has now apologized and reversed his earlier decision.

As the corporate spin doctors contain the damage, Armstrong’s statements likely will recede from memory, although I am still hoping The Onion will memorialize Armstrong’s entry into the healthcare debate (suggested headline, “CEO Discovers Nation’s Healthcare Crisis Caused by 25 Ounce Baby”). But supposing (just supposing) your health law students ask about the story in class this week. What sort of journey can you take them on?

First (but only if you are feeling particularly mean), you could start with HIPAA privacy. After all, intuitively it seemed strange to hear an employer publicly describing the serious health problems of employees’ family members. With luck your students will volunteer that the HIPAA Privacy Rule does not apply to employers (they are not “covered entities”). True, but AOL provided employees and their families with a health plan. Assume this was an employer-sponsored plan of some scale. It remains the case that the plan and not the employer is subject to the Privacy Rule, although following the Omnibus rule, the plan and its business associates are going to face increased regulation (such as breach notification, new privacy notices, etc.). The employer’s responsibilities are to be found at 45 CFR 164.504 and primarily 164.504(f) (and here we descend deep into the HIPAA weeds). The employer must ensure that the plan sets out the plan members’ privacy rights vis-à-vis the employer. For plans like these, the employer can be passed somewhat deidentified summary information (though for very limited purposes that don’t seem to include TV appearances). However, if the employer essentially administers the plan, then things get more complicated. Firewalls are required between different groups of employees, and employer use of PHI is severely limited. By the way, and in fairness to Mr. Armstrong, there are many things we don’t know about the AOL health plan: the source of his information about the “distressed babies,” whether any PHI had been deidentified, etc. Yet, at the very least, AOL may have opened itself up to OCR asking similar questions and starting an investigation into how AOL treats enrollee information.

Second, this storm about the babies’ health insurance should provide a good basis for discussion of the various types of health insurance and their differential treatment by the Affordable Care Act. A large company likely will offer either a fully-insured or self-insured plan to its employees. If the latter, would your students have recommended reinsurance against claim “spikes” with a stop-loss policy? The ACA should have relatively little impact on such plans or their cost, except where the plans fall beneath the essential-benefits floor. Contrast such plans with those traditionally offered on the individual market, which are now being replaced with the lower-cost (subject again to extra costs associated with essential benefits) plans offered on the health exchanges.

Third, this entire episode raises the question of health care costs and, specifically, the pricing of health care. On first hearing, a million-dollar price tag seems extraordinary. Yet as Ms. Fei noted in her Slate article, her daughter spent three months in a neonatal ICU and endured innumerable procedures and tests, resulting in “a 3-inch thick folder of hospital bills that range from a few dollars and cents to the high six figures.” Now, the ACA may be criticized for not doing enough to cut costs (how about a quick pop quiz on what it does try to do?), but is there any truth to the argument that it raises health care costs? Recent investigative work by Steve Brill and fine scholarship by Erin Fuse Brown have highlighted both high prices and high differential pricing in health care. So why would a corporate executive (either directly or indirectly) blame high prices on the ACA? Are, for example, technology markets so different that the reasons for health care costs are underappreciated? And by extension, instead of fighting the ACA, why are corporate CEOs not urging a second round of legislation aimed specifically at reducing the cost of healthcare for all? After all, it is highly unlikely FFS pricing would be tolerated in their non-health domains. Or does such a group prefer the status quo and what Beatrix Hoffman critically terms rationing by price?

My Baby and AOL’s Bottom Line

By Deanna Fei
Slate Magazine, February 9, 2014

That “distressed baby” who Tim Armstrong blamed for benefit cuts? She’s my daughter.

Late last week, Tim Armstrong, the chief executive officer of AOL, landed himself in a media firestorm when he held a town hall with employees to explain why he was paring their retirement benefits. After initially blaming Obamacare for driving up the company’s health care costs, he pointed the finger at an unlikely target: babies.

Specifically, my baby.

“Two things that happened in 2012,” Armstrong said. “We had two AOL-ers that had distressed babies that were born that we paid a million dollars each to make sure those babies were OK in general. And those are the things that add up into our benefits cost. So when we had the final decision about what benefits to cut because of the increased healthcare costs, we made the decision, and I made the decision, to basically change the 401(k) plan.”

Within hours, that quote was all over the Internet. On Friday, Armstrong’s logic was the subject of lengthy discussions on CNN, MSNBC, and other outlets. Mothers’ advocates scolded him for gross insensitivity. Lawyers debated whether he had violated his employees’ privacy. Health care experts noted that his accounting of these “million-dollar babies” seemed, at best, fuzzy.

Plenty of smart, witty people took to Twitter to express their outrage—or mock outrage. The phrase “distressed babies” became practically an inside joke, as in, “How many distressed babies does AOL pay this guy?” A few AOL employees made cracks like this: “I swear I didn’t have any babies in 2012. Don’t hate me for messing up your 401(k).”

To view the full article, please visit My Baby and AOL’s Bottom Line

New CLIA rule talks the talk, but it doesn’t walk the walk

Deborah Peel, MD, Founder and Chair of Patient Privacy Rights

The federal government released an update to the CLIA rule this week that will require all labs to send test results directly to patients. But the regulations fail to achieve their stated intent of helping patients. The rule allows labs to delay patient access to test results for up to 30 days, and the process for directly obtaining personal test results from labs is not automated.

The new rule also fails to help patients in significant ways:

  • Real-time, online test results are not required. The federal government should have required all labs to use technology that benefits patients by enabling easy, automatic access to test results via the Internet in real time (see the sketch after this list). Unless we can obtain real-time access to test results, we can’t get a timely second opinion or verify that the appropriate tests were ordered at the right time for our symptoms and diseases.
  • Labs are allowed to charge fees for providing test results to patients. If labs can charge fees, they will not automate the process for patients to obtain results. Labs that automate patient access to test results online would incur a one-time cost. After that, no human work or time is needed to provide patients their test results, so the labs would have no ongoing costs to recoup from patients.
  • Labs are not banned from selling patient data. Labs should be banned from selling, sharing, or disclosing patient test results to anyone other than the physician who ordered the tests without meaningful informed consent. This unfair and deceptive trade practice should be stopped. No patient expects labs to sell or share their test results with any person or company except the physician who ordered the test(s).
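As a sketch of what the first item asks for, consider what automated, real-time access could look like if a lab exposed a standard FHIR API. The endpoint, patient ID, and token below are hypothetical placeholders; the Observation search is standard FHIR, but nothing here describes any actual lab’s system.

```python
# A sketch of real-time patient access, assuming a lab exposed a standard FHIR
# API. The endpoint, patient ID, and token are hypothetical placeholders.
import requests

def fetch_lab_results(base_url: str, patient_id: str, token: str) -> list[dict]:
    """Pull the patient's laboratory Observation resources from a FHIR server."""
    response = requests.get(
        f"{base_url}/Observation",
        params={"patient": patient_id, "category": "laboratory"},
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    response.raise_for_status()
    bundle = response.json()  # a FHIR Bundle; results live under "entry"
    return [entry["resource"] for entry in bundle.get("entry", [])]

# Hypothetical usage: print each test name and value as soon as it is posted.
for obs in fetch_lab_results("https://fhir.example-lab.org", "patient-77", "TOKEN"):
    name = obs.get("code", {}).get("text")
    value = obs.get("valueQuantity", {})
    print(name, value.get("value"), value.get("unit"))
```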

This rule raises a question: why do so many federal rules for improving the healthcare system fail to require technologies that benefit patients?

Technology could provide enormous benefits to patients, but the US government caters to the healthcare and technology industries, instead of protecting patients.

Current US health IT systems actually facilitate the exploitation of patients’ records via technology. When HHS eliminated patient control over personal health data from HIPAA in 2002, it created a massive hidden US data broker industry that sells, shares, aggregates, and discloses longitudinal patient profiles (for an example, see IMS’s SEC filing, with details about selling 400 million longitudinal patient profiles to 5,000 clients, including the U.S. government).

Meanwhile, even the most mundane, annoying, repetitive tasks patients must perform today, like filling out new paper forms with personal information every time we visit a doctor, are not automated for our convenience or to improve data quality and accuracy.

Shouldn’t IT improve patients’ experiences, treatment, and restore personal control over sensitive health information?

deb

You can also view a copy of this blog post here.

WPF Report — Paying Out of Pocket to Protect Health Privacy: A New but Complicated HIPAA Option; A Report on the HIPAA Right to Restrict Disclosure

San Diego & Washington, D.C. — The World Privacy Forum published a new report today that helps patients understand and use the new HIPAA right to restrict disclosure of their medical information to health plans when treatment is paid for out of pocket in full. The report contains practical advice and tips for patients about how to navigate the new right, which went into effect last year. Paying Out of Pocket to Protect Health Privacy: A New but Complicated HIPAA Option; A Report on the HIPAA Right to Restrict Disclosure is one of the first reports on this topic written for patients.

“The new HIPAA right that lets patients restrict disclosures of their health information is actually not well known yet, and that needs to change,” said Pam Dixon, Executive Director of the World Privacy Forum. “This report has specific, concrete tips and information that will help patients use this important new right.” The report, written by Bob Gellman and Pam Dixon, is available free of charge at www.worldprivacyforum.org.

Key points:

  • A patient has the right to prevent a health care provider from reporting information to a health insurer if the patient pays in full.
  • In order to prevent disclosure of information to a health plan, patients must make a Request to Restrict Disclosure.
  • Under the new changes to HIPAA, a patient has the firm right to demand, not just request, that a provider not disclose PHI to a health plan when certain conditions are met.
  • The conditions to be met can be complex, and work best with some advance planning.

Additional tips are in the report.

The bipartisan Coalition for Patient Privacy worked to get this key consumer protection into HITECH.

Bob Gellman and Pam Dixon are available to discuss tips and advice for patients on how to use the new HIPAA right.

Links:

The report Paying Out of Pocket to Protect Health Privacy: A New but Complicated HIPAA Option; A Report on the HIPAA Right to Restrict Disclosure is available in PDF or in text.

Permalink: http://www.worldprivacyforum.org/2014/01/wpf-report-paying-out-of-pocket-to-protect-health-privacy/

Contact:

Bob Gellman 202-543-7023

Pam Dixon 760-712-4281

info@worldprivacyforum.org

Guest Article: The Causes of Digital Patient Privacy Loss in EHRs and Other Health IT Systems

Check out the latest from Shahid Shah, courtesy of The Healthcare IT Guy.

This past Friday I was invited by the Patient Privacy Rights (PPR) Foundation to lead a discussion about privacy and EHRs. The discussion, entitled “Fact vs. Fiction: Best Privacy Practices for EHRs in the Cloud,” addressed patient privacy concerns and potential solutions for doctors working with EHRs.

While we are all somewhat disturbed by the slow erosion of privacy in all aspects of our digital lives, the rather rapid loss of patient privacy around health data is especially unnerving because healthcare is so near and dear to us all. To make sure we provided some actionable intelligence during the PPR discussion, I started the talk off by giving some of the reasons why we’re losing patient privacy, in the hope that it might encourage innovators to think about ways of slowing down the inevitable losses.

Here are some of the causes I mentioned on Friday, not in any particular order:

  • Most patients, even technically astute ones, don’t really understand the concept of digital privacy. Digital is a “cyber world” that is not easy to picture, so patients believe their data and privacy are protected when they may not be. I usually explain patient privacy in the digital world to non-techies using the analogy of curtains, doors, and windows. The digital health IT world of today is like a hospital in which every patient room is one large shared space with no curtains, no walls, and no doors (not even for bathrooms or showers!). In this imaginary world, every private conversation can be overheard and every procedure is performed in front of others, without the patient’s consent; their objections don’t even matter. If they can imagine that scenario, then patients will have a good idea of how digital privacy is handled today: a big shared room where everyone sees and hears everything, even over patients’ objections.
  • It’s faster and easier to create non-privacy-aware IT solutions than privacy-aware ones. Having built dozens of HIPAA-compliant and highly secure enterprise health IT systems over decades, my anecdotal experience is that when it comes to features and functions vs. privacy, features win. Product designers, architects, and engineers talk the talk, but given the difficulties of creating viable systems in a coordinated, integrated digital ecosystem, it’s really hard to walk the privacy walk. Because digital privacy is so hard to describe even in simple single-enterprise systems, the difficulty of describing and defining it across multiple integrated systems is often the reason for poor privacy features in modern systems.
  • It’s less expensive to create non-privacy-aware IT solutions. Because designing privacy into software from the beginning is hard and requires expensive security resources, developers often wait until the end of the process to consider privacy. Privacy can no more be added on top of an existing system than security can: either it’s built into the functionality or it’s just going to be missing. Because it’s cheaper to leave it out, it’s often left out.
  • The government is incentivizing and certifying functionality over privacy and security. All the meaningful use certification and testing steps are focused too much on prescribed functionality and not enough on data-centric privacy capabilities such as notifications, disclosure tracking, and compartmentalization (a sketch of a simple disclosure log follows this list). If privacy were important in EHRs, the NIST test plans would cover it. Privacy is difficult to define and even more difficult to implement, so the testing process doesn’t focus on it at this time.
  • Business models that favor privacy loss tend to be more profitable. Data aggregation and homogenization, resale, secondary use, and related business models tend to be quite profitable. The only way they will remain profitable is to have easy and unfettered (low-friction) ways of sharing and aggregating data. Because enhanced privacy through opt-in processes, disclosures, and notifications would reduce data sharing and potentially reduce revenues and profit, privacy loss is going to accompany the inevitable rise of EHRs.
  • Patients don’t really demand privacy from their providers or IT solutions in the same way they demand other things. We like to think that all patients demand digital privacy for their data. However, it’s rare for patients to choose physicians, health systems, or other care providers based on their privacy views. Even when privacy violations are found and punished, it’s uncommon for patients to switch to other providers.
  • Regulations like HIPAA have made it easy for privacy loss to occur. HIPAA has probably done more to harm privacy over the past decade than any other government regulation. More on this in a later post.
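Here is that disclosure-log sketch: a minimal, append-only record of every disclosure of a patient’s data, one of the data-centric capabilities the certification process skips. The schema and file-based storage are hypothetical; a real EHR would persist these events in its database and expose them to patients through a portal.

```python
# A sketch of an append-only disclosure log, one of the data-centric privacy
# capabilities discussed above. The schema and file storage are hypothetical.
import json
import time

def log_disclosure(log_path: str, patient_id: str, recipient: str,
                   purpose: str, data_elements: list[str]) -> None:
    """Append one disclosure event; prior entries are never updated or deleted."""
    event = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "patient_id": patient_id,
        "recipient": recipient,
        "purpose": purpose,
        "data_elements": data_elements,
    }
    with open(log_path, "a", encoding="utf-8") as log:
        log.write(json.dumps(event) + "\n")

def disclosures_for_patient(log_path: str, patient_id: str) -> list[dict]:
    """Everything a patient is entitled to see about where their data went."""
    with open(log_path, encoding="utf-8") as log:
        return [event for line in log
                if (event := json.loads(line))["patient_id"] == patient_id]

# Example: record a disclosure, then show it to the patient on request.
log_disclosure("disclosures.jsonl", "patient-77", "Acme Analytics",
               "claims analysis", ["diagnoses", "prescriptions"])
print(disclosures_for_patient("disclosures.jsonl", "patient-77"))
```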

The only way to improve privacy across the digital spectrum is to realize that health providers need to conduct business in a tricky, intermediary-driven health system with sometimes conflicting business goals, like reduction of medical errors or lower cost (which can only come with more data sharing, not less). Digital patient privacy is important, but there are many valid reasons why privacy is either hard or impossible to achieve in today’s environment. Unless we intelligently and honestly understand why we lose patient privacy, we can’t really create novel and unique solutions to help curb the loss.

What do you think? What other causes of digital patient privacy loss would you add to my list above?

Courtesy of The Healthcare IT Guy.

Should You Lose Your Gun Rights If You Visit a Shrink?

From Michael E. Hammond in the Washington Times: Obama’s gun-control dictate on ‘mental health’ threatens veterans’ rights

In a preternatural example of tone-deafness, an administration under fire for snooping into Americans’ privacy is now proposing to waive federal privacy laws so psychiatrists can report their gun-owning patients to the government.

The Department of Health and Human Services’ “notice of proposed rule-making,” floated by the White House in a Friday media dump, would waive portions of the federal Health Insurance Portability and Accountability Act (HIPAA) to allow psychiatrists to report their patients to the FBI’s gun-ban blacklist (the NICS system) on the basis of confidential communications.

The 1968 Gun Control Act bans guns for anyone who is “adjudicated as a mental defective or … committed to a mental institution.”

Unfortunately, under the 2008 NICS Improvement Act, drafted by Sen. Charles E. Schumer, New York Democrat, and its regulations, that “adjudication” can be made by any “other lawful authority.” This means a diagnosis by a single psychiatrist in connection with a government program.

To read the full article, please click here.

Company That Knows What Drugs Everyone Takes Going Public

Nearly every time you fill a prescription, your pharmacy sells details of the transaction to outside companies, which compile and analyze the information to resell to others. The data includes the age and gender of the patient; the name, address, and contact details of their doctor; and details about the prescription.

A 60-year-old company little known to the public, IMS Health, is leading the way in gathering this data. It says it has assembled “85% of the world’s prescriptions by sales revenue and approximately 400 million comprehensive, longitudinal, anonymous patient records.”

IMS Health sells data and reports to all of the top 100 global pharmaceutical and biotechnology companies, as well as consulting firms, advertising agencies, government bodies, and financial firms. In a January 2nd filing with the Securities and Exchange Commission announcing an upcoming IPO, IMS said it processes data from more than 45 billion healthcare transactions annually (more than six for each human on earth, on average) and collects information from more than 780,000 different streams of data worldwide.

Deborah Peel, a Freudian psychoanalyst who founded Patient Privacy Rights in Austin, Texas, has long been concerned about corporate gathering of medical records.

“I’ve spent 35 years or more listening to how people have been harmed because their records went somewhere they didn’t expect,” she says. “It got to employers who either fired them or demoted them or used the information to destroy their reputation.”

“It’s just not right. I saw massive discrimination in the paper age. Exponential isn’t even a big enough word for how far and how much the data is going to be used in the information age,” she continued. “If personal health data ‘belongs’ to anyone, surely it belongs to the individual, not to any corporation that handles, stores, or transmits that information.”

To view the full article please visit: Company That Knows What Drugs Everyone Takes Going Public