What You Need to Know About Patient Matching and Your Privacy and What You Can Do About It

Today, ONC released a report on patient matching practices, and to the casual reader it will look like a byzantine subject. It’s not.

You should care about patient matching, and you will.

It impacts your ability to coordinate care, to purchase life and disability insurance, and maybe even your job. Through ID theft, it also impacts your safety and security. Patient matching’s most significant impact, however, could be to your pocketbook, as it’s being used to fix prices and reduce competition in a high-deductible insurance system that exposes families to up to $12,700 of out-of-pocket expenses every year.

Patient matching is the healthcare cousin of NSA surveillance.

Health IT’s watershed will come when people finally realize that hospital privacy and security practices are unfair and we begin to demand consent, data minimization, and transparency for our most intimate information. The practices suggested by Patient Privacy Rights are relatively simple and obvious and will be discussed toward the end of this article.

Health IT tries to be different from other IT sectors. There are many reasons for this, and few of them are good. Health IT practices are dictated by HIPAA, whereas the rest of IT is governed by the FTC or the Fair Credit Reporting Act. Healthcare is mostly paid for by third-party insurance, so the risks of fraud are different from those in traditional markets.

Healthcare is delivered by strictly licensed professionals who are regulated differently from the institutions that purchase the health IT. These are the major reasons for healthcare IT exceptionalism, but they are not a good excuse for bad privacy and security practices, and this is about to change.

Health IT privacy and security are in tatters, and nowhere is it more evident than the “patient matching” discussion. Although HIPAA has some significant security features, it also eliminated a patient’s right to consent and Fair Information Practice.

Patient matching by all sorts of health information aggregators and health information exchanges is involuntary and hidden from the patient as much as NSA surveillance is.

We patients have no idea how many databases are tracking our every healthcare action. We have no equivalent to the Fair Credit Reporting Act to cover these database operators. The databases are both public and private. The public ones are called Health Information Exchanges, All Payer Claims Databases, Prescription Drug Monitoring Programs, Mental Health Registries, Medicaid, and more.

The private ones are called “analytics” and sell billions of dollars’ worth of our aggregated data to hospitals eager to improve their margins, if not their mission.

The ONC report overlooks the obvious issue of FAIRNESS to the patient. The core principles of Fair Information Practice are Consent, Minimization, and Transparency. The current report ignores all three:

- Consent is not asked. By definition, patient matching is required for information sharing, so patient matching without patient consent leads to sharing of PHI without patient consent. The consent form that authorizes patient matching must list the actual parameters that will be used for the match. Today’s generic Notices of Privacy Practices are as inadequate as signing a blank check.

- Data is not minimized. Citizen matching outside the health sector is usually based on a unique and well-understood identifier such as a phone number, email, or SSN. To the extent that the report does not allow patients to specify their own matching criteria, a lot of extra private data is shared for patient matching purposes. This violates data minimization.

- Transparency is absent. The patient is not notified when they are matched. This violates the most basic principles of error management and security. In banking or online services, it is routine to get a simple email or a call when a security-sensitive transaction is made.

This must be required of all patient matching in healthcare. In addition, patients are not given access to the matching databases. This elementary degree of transparency is already the law for credit bureaus that match citizens under the Fair Credit Reporting Act, and it should be at least as strict in health care.

These elementary features of any EHR and any exchange are the watershed defining patient-centered health IT. If a sense of privacy and trust doesn’t push our service providers to treat patients as first-class users, then the global need for improved cybersecurity will have to drive the shift. Healthcare is critical infrastructure just as much as food and energy.

But what can you, as a patient, do to hasten your emancipation? I would start with this simple checklist:

Opt-out of sharing your health records unless the system offers:

  • Direct secure messaging with patients
  • Plain email or text notification of records matching
  • Patient-specified Direct email as match criterion
  • Your specific matching identifiers displayed on all consent forms
  • Online patient access to matchers and other aggregator databases

None of these five requirements are too hard. Google, Apple and your bank have done all of these things for years. The time has come for healthcare to follow suit.

Adrian Gropper, MD is Chief Technical Officer of Patient Privacy Rights and participates in Blue Button+, Direct secure messaging governance efforts and the evolution of patient-directed health information exchange.

Check out the Latest from Dr. Gropper, courtesy of The Healthcare Blog.

Pairing patient privacy with health big data analytics

“Health privacy and security are often mentioned in tandem, but Deborah Peel, Founder and Chair of Patient Privacy Rights, and Adrian Gropper, Chief Technology Officer of Patient Privacy Rights, took a different view in a recent Institute for Health Technology Transformation (iHT2) webcast.”

“The presentation, titled ‘Competing for Patient Trust and Data Privacy in the Age of Big Data,’ detailed a few of the nuances between patient data privacy and security and why privacy is so significant as healthcare organizations pull together huge data sets for health information exchange (HIE) and accountable care.”

To view the full article, please visit: Pairing patient privacy with health big data analytics

The webcast can be viewed at: Competing for Patient Trust and Data Privacy in the Age of Big Data Webinar

Patient Privacy Rights hires CTO

From the article and Q&A by Diana Manos in Health Care IT News: Patient Privacy Rights hires CTO

“Patient Privacy Rights appointed Adrian Gropper, MD as its first chief technology officer. Gropper is an expert in the regulated medical device field, an experienced medical informatics executive, and he has a long record of contributing to the development of state and national health information standards, according to a PPR news release.

“Gropper, who has worked with federal initiatives and the Markle Foundation to help create the Direct Project’s secure email system and Blue Button technologies, says he joins PPR because the challenges of runaway costs and deep inequities in the U.S. health system call for new information tools and inspired regulation.

“PPR’s deep respect for the medical profession and our total dedication to the patient perspective form the foundation for a series of policy and practice initiatives to shape health reform and electronic health systems,” Gropper said in the news release. “As a member of the PPR team, I look forward to driving a national consensus on the most difficult issues in the information age, including respectful patient identity, trustworthy consent, research acceleration, and effective public health.”

“According to PPR, Gropper is a pioneer in privacy-preserving health information technology going as far back as the Guardian Angel Project at MIT in 1994. As CTO of one of the earliest personal health records companies, MedCommons, he actively participated in most of the PHR policy and standards initiatives of the past decade.”

See the full Q&A
See PPR’s Press Release