Privacy could ‘crash’ big data if not done right

April 15, 2014 | By Ashley Gold | FierceHealthIT

Privacy has the potential to crash big data before there’s a chance to get it right, and finding the right balance is key to future success, experts argued at a Princeton University event earlier this month.

The event, titled “Big Data and Health: Implications for New Jersey’s Health Care System,” featured four panels exploring health, privacy, cost and transparency in regard to how big data can improve care and patient outcomes, according to an article on the university’s website.

“Privacy will crash big data if we don’t get it right,” Joel Reidenberg, visiting professor of computer science at Princeton and a professor at Fordham University’s School of Law, said at the event.

To view the full article, please visit: Privacy could ‘crash’ big data if not done right

What You Need to Know About Patient Matching and Your Privacy and What You Can Do About It

Today, ONC released a report on patient matching practices, and to the casual reader it will look like a byzantine subject. It’s not.

You should care about patient matching, and you will.

It impacts your ability to coordinate care, purchase life and disability insurance, and maybe even your job. Through ID theft, it also impacts your safety and security. Patient matching’s most significant impact, however, could be to your pocketbook, as it’s being used to fix prices and reduce competition in a high-deductible insurance system that exposes families to up to $12,700 of out-of-pocket expenses every year.

Patient matching is the healthcare cousin of NSA surveillance.

Health IT’s watershed will come when people finally realize that hospital privacy and security practices are unfair and we begin to demand consent, data minimization, and transparency for our most intimate information. The practices suggested by Patient Privacy Rights are relatively simple and obvious and will be discussed toward the end of this article.

Health IT tries to be different from other IT sectors. There are many reasons for this; few of them are good. Health IT practices are dictated by HIPAA, whereas the rest of IT answers to either the FTC or the Fair Credit Reporting Act. Healthcare is mostly paid for by third-party insurance, so the risks of fraud are different from those in traditional markets.

Healthcare is delivered by strictly licensed professionals regulated differently from the institutions that purchase the health IT. These are the major reasons for healthcare IT exceptionalism, but they are not a good excuse for bad privacy and security practices, so this is about to change.

Health IT privacy and security are in tatters, and nowhere is it more evident than the “patient matching” discussion. Although HIPAA has some significant security features, it also eliminated a patient’s right to consent and Fair Information Practice.

Patient matching by all sorts of health information aggregators and health information exchanges is involuntary and hidden from the patient as much as NSA surveillance is.

Patients have no idea how many databases are tracking our every healthcare action. We have no equivalent to the Fair Credit Reporting Act to cover these database operators. The databases are both public and private. The public ones are called Health Information Exchanges, All Payer Claims Databases, Prescription Drug Monitoring Programs, Mental Health Registries, Medicaid, and more.

The private ones are called “analytics” and sell billions of dollars’ worth of our aggregated data to hospitals eager to improve their margins, if not their mission.

The ONC report overlooks the obvious issue of FAIRNESS to the patient. The core principles of Fair Information Practice are consent, minimization, and transparency. The current report ignores all three:

- Consent is not asked. By definition, patient matching is required for information sharing, so patient matching without patient consent leads to sharing of PHI without patient consent. The consent form used to authorize patient matching must list the actual parameters that will be used for the match. Today’s generic Notices of Privacy Practices are as inadequate as a signed blank check.

- Data is not minimized. Citizen matching outside of the health sector is usually based on a unique and well-understood identifier such as a phone number, email, or SSN. To the extent that the report does not allow patients to specify their own matching criterion, a lot of extra private data is shared for patient matching purposes, violating data minimization (see the sketch below).

- Transparency is absent. The patient is not notified when they are matched. This violates the most basic principles of error management and security. In banking or online services, it is routine to get a simple email or a call when a security-sensitive transaction is made.

This must be required of all patient matching in healthcare. In addition, patients are not given access to the matching database. This elementary degree of transparency for credit bureaus that match citizens is law under the Fair Credit Reporting Act and should be at least as strict in health care.
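To make the minimization and transparency points concrete, here is a minimal sketch in Python. It is not drawn from the ONC report; the record fields, the Direct address, and the function names are all invented for illustration. It contrasts matching on several demographic quasi-identifiers with matching on a single patient-specified identifier, and it notifies the patient whenever a match is made.

```python
from dataclasses import dataclass

@dataclass
class PatientRecord:
    direct_email: str  # hypothetical patient-specified match identifier
    name: str
    birth_date: str    # ISO date, e.g. "1970-01-01"
    zip_code: str

def match_demographic(a: PatientRecord, b: PatientRecord) -> bool:
    # Status quo: several quasi-identifiers must be shared just to decide a
    # match, so more private data moves than the match itself requires.
    return (a.name.lower(), a.birth_date, a.zip_code) == (
        b.name.lower(), b.birth_date, b.zip_code)

def match_minimized(a: PatientRecord, b: PatientRecord) -> bool:
    # Data minimization: one patient-chosen identifier decides the match;
    # no other demographics need to leave the source system.
    return a.direct_email.lower() == b.direct_email.lower()

def notify_patient(patient: PatientRecord, requester: str) -> None:
    # Transparency: alert the patient on every match, the way a bank confirms
    # a security-sensitive transaction (stubbed here as a print).
    print(f"Notice to {patient.direct_email}: record matched by {requester}")

if __name__ == "__main__":
    a = PatientRecord("jane@direct.example.org", "Jane Doe", "1970-01-01", "08540")
    b = PatientRecord("jane@direct.example.org", "Jane Doe", "1970-01-01", "08540")
    if match_minimized(a, b):
        notify_patient(a, "Example Health Information Exchange")
```

Note the asymmetry: the demographic matcher needs name, birth date, and ZIP from both parties, while the minimized matcher needs only the one identifier the patient chose to share.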

These elementary features of any EHR and any exchange are the watershed defining patient-centered health IT. If a sense of privacy and trust doesn’t push our service providers to treat patients as first-class users, then the global need for improved cybersecurity will have to drive the shift. Healthcare is critical infrastructure just as much as food and energy.

But what can you, as a patient, do to hasten your emancipation? I would start with this simple checklist:

Opt-out of sharing your health records unless the system offers:

  • Direct secure messaging with patients
  • Plain email or text notification of records matching
  • Patient-specified Direct email as match criterion
  • Your specific matching identifiers displayed on all consent forms
  • Online patient access to matchers and other aggregator databases

None of these five requirements are too hard. Google, Apple and your bank have done all of these things for years. The time has come for healthcare to follow suit.

Adrian Gropper, MD is Chief Technical Officer of Patient Privacy Rights and participates in Blue Button+, Direct secure messaging governance efforts and the evolution of patient-directed health information exchange.

Check out the Latest from Dr. Gropper, courtesy of The Healthcare Blog.

Guest Blog – The AOL Babies: Our Healthcare Crisis in a Nut

Check out the latest from Nic Terry, courtesy of HealthLawProf Blog.

Where does one start with AOL CEO Armstrong’s ridiculous and unfeeling justifications for changes in his company’s 401(k) plan? Cable TV and Twitter came out of the blocks fast with the obvious critiques. And the outrage only increased after novelist Deanna Fei took to Slate to identify her daughter as one of the subjects of Armstrong’s implied criticism. Armstrong has now apologized and reversed his earlier decision.

As the corporate spin doctors contain the damage, Armstrong’s statements likely will recede from memory, although I am still hoping The Onion will memorialize Armstrong’s entry into the healthcare debate (suggested headline, “CEO Discovers Nation’s Healthcare Crisis Caused by 25 Ounce Baby”). But supposing (just supposing) your health law students ask about the story in class this week. What sort of journey can you take them on?

First (but only if you are feeling particularly mean), you could start with HIPAA privacy. After all, intuitively it seemed strange to hear an employer publicly describing the serious health problems of employees’ family members. With luck your students will volunteer that the HIPAA Privacy Rule does not apply to employers (they are not “covered entities”). True, but AOL provided employees and their families with a health plan. Assume this was an employer-sponsored plan of some scale. It remains the case that the plan, and not the employer, is subject to the Privacy Rule, although following the Omnibus rule, the plan and its business associates are going to face increased regulation (such as breach notification, new privacy notices, etc.). The employer’s responsibilities are to be found at 45 CFR 164.504 and primarily 164.504(f) (and here we descend deep into the HIPAA weeds). The employer must ensure that the plan sets out the plan members’ privacy rights vis-à-vis the employer. For plans like these the employer can be passed somewhat deidentified summary information (though for very limited purposes that don’t seem to include TV appearances). However, if the employer essentially administers the plan, then things get more complicated. Firewalls are required between different groups of employees, and employer use of PHI is severely limited. By the way, and in fairness to Mr. Armstrong, there are many things we don’t know about the AOL health plan: the source of his information about the “distressed babies,” whether any PHI had been deidentified, etc. Yet, at the very least, AOL may have opened itself up to OCR asking similar questions and starting an investigation into how AOL treats enrollee information.

Second, this storm about the babies’ health insurance should provide a good basis for discussion of the various types of health insurance and their differential treatment by the Affordable Care Act. A large company likely will offer either a fully-insured or self-insured plan to its employees. If the latter, would your students have recommended reinsurance against claim “spikes” with a stop-loss policy? The ACA should have relatively little impact on such plans or their cost, except where the plans fall beneath the essential benefits floor. Contrast such plans with those traditionally offered on the individual market, which are now being replaced with lower-cost exchange-offered plans (subject, again, to extra costs associated with essential benefits).
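On the stop-loss question, a toy calculation (with invented numbers, not AOL’s actual plan terms) shows how specific stop-loss coverage caps a self-insured employer’s exposure to a claim spike like a million-dollar NICU stay:

```python
# Toy stop-loss arithmetic; the attachment point is invented for illustration.
ATTACHMENT_POINT = 250_000  # hypothetical per-claimant deductible the employer retains

def split_claim(claim: int, attachment: int = ATTACHMENT_POINT) -> tuple[int, int]:
    # The employer pays up to the attachment point; the stop-loss carrier
    # reimburses the excess above it.
    employer_share = min(claim, attachment)
    carrier_share = max(claim - attachment, 0)
    return employer_share, carrier_share

employer, carrier = split_claim(1_000_000)
print(f"Employer retains ${employer:,}; stop-loss carrier pays ${carrier:,}")
# Employer retains $250,000; stop-loss carrier pays $750,000
```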

Third, this entire episode raises the question of health care costs and, specifically, the pricing of health care. On first hearing, a million-dollar price tag seems extraordinary. Yet as Ms. Fei noted in her Slate article, her daughter spent three months in a neonatal ICU and endured innumerable procedures and tests, resulting in “a 3-inch thick folder of hospital bills that range from a few dollars and cents to the high six figures.” Now, the ACA may be criticized for not doing enough to cut costs (how about a quick pop quiz on what it does try to do?), but is there any truth to the argument that it raises health care costs? Recent investigative work by Steve Brill and fine scholarship by Erin Fuse Brown have highlighted both high prices and high differential pricing in health care. So why would a corporate executive (either directly or indirectly) blame high prices on the ACA? Are, for example, technology markets so different that the reasons for health care costs are underappreciated? And by extension, instead of fighting the ACA, why are corporate CEOs not urging a second round of legislation aimed specifically at reducing the cost of healthcare for all? After all, it is highly unlikely that FFS pricing would be tolerated in their non-health domains. Or does such a group prefer the status quo and what Beatrix Hoffman critically terms rationing by price?

Privacy Tools: Opting Out from Data Brokers

By Julia Angwin
ProPublica, Jan. 30, 2014

Data brokers have been around forever, selling mailing lists to companies that send junk mail. But in today’s data-saturated economy, data brokers know more about us than ever, with sometimes disturbing results.

Earlier this month, OfficeMax sent a letter to a grieving father addressed to “daughter killed in car crash.” And in December, privacy expert Pam Dixon testified in Congress that she had found data brokers selling lists with titles such as “Rape Sufferers” and “Erectile Dysfunction sufferers.” And retailers are increasingly using this type of data to make decisions about which credit card to offer people or how much to charge individuals for a stapler.

During my book research, I sought to obtain the data that brokers held about me. At first, I was excited to be reminded of the address of my dorm room and my old phone numbers. But the thrill quickly wore off as the reports rolled in. I was as irked by the reports that were wrong (data brokers who thought I was a single mother with no education) as I was by the ones that were correct: is it necessary for someone to track that I recently bought underwear online? So I decided to opt out from the commercial data brokers.

View the full article, Privacy Tools: Opting Out from Data Brokers, to get a list of the companies that track your information, links to their privacy pages, and instructions on how to opt out.

Texas Election 2014: Abbott Pledges to Safeguard DNA

“Texas gubernatorial frontrunner Greg Abbott recently released an extensive list of items he says he’ll push for once elected. This list includes gun rights, campaign ethics, and blocking implementation of the Affordable Care Act, but the number one item is safeguarding your DNA, according to KUT News.”

To view the full article, please visit: Texas Election 2014: Abbott Pledges to Safeguard DNA

Will Texans Own Their DNA?

Greg Abbott, candidate for Governor, thinks they should

On November 12th, Abbott released his “We the People Plan” for Texas. Clearly he’s heard from Texans who want tough new health data privacy protections.

Topping his list are four terrific privacy recommendations for health and genetic data (the fourth is illustrated in the sketch that follows the list):

  • “Recognize a property right in one’s own DNA.”
  • “Make state agencies, before selling database information, acquire the consent of any individual whose data is to be released.”
  • “Prohibit data resale and anonymous purchasing by third parties.”
  • “Prohibit the use of cross referencing techniques to identify individuals whose data is used as a larger set of information in an online data base.”
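The fourth recommendation targets what is often called a linkage attack: joining an “anonymous” data set against a public list, such as a voter roll, on shared quasi-identifiers. A minimal sketch (every record and name below is invented) shows how the join recovers identities:

```python
# Hypothetical linkage attack; all records are fabricated for illustration.
anonymous_claims = [
    {"zip": "78701", "birth_date": "1980-05-02", "sex": "F", "diagnosis": "diabetes"},
    {"zip": "79901", "birth_date": "1975-11-30", "sex": "M", "diagnosis": "depression"},
]
public_voter_roll = [
    {"name": "Alice Example", "zip": "78701", "birth_date": "1980-05-02", "sex": "F"},
    {"name": "Bob Example", "zip": "79901", "birth_date": "1975-11-30", "sex": "M"},
]

def reidentify(claims, roll):
    # Join on the shared quasi-identifiers (ZIP, birth date, sex). No name
    # appears in the claims data, yet the join attaches one to each record.
    def key(record):
        return (record["zip"], record["birth_date"], record["sex"])
    names = {key(voter): voter["name"] for voter in roll}
    return [(names[key(c)], c["diagnosis"]) for c in claims if key(c) in names]

for name, diagnosis in reidentify(anonymous_claims, public_voter_roll):
    print(f"{name} -> {diagnosis}")
```

This is why “anonymous purchasing” and “cross referencing” appear together in the plan: stripping names is not enough when the remaining fields can be joined against another database.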

The Omnibus Privacy Rule operationalized the technology section of the stimulus bill (the HITECH Act). It also clarified that states can pass data privacy laws that are stronger than HIPAA (which is a very weak floor for data protections).

Texans would overwhelmingly support the new state data protection laws Abbott recommends. If elected, Abbott would ideally also include strong penalties for violations. Contracts don’t enforce themselves; external auditing and proof of trustworthy practices should be required.

Is this the beginning of a national trend? I think so.

The more the public learns about today’s health IT systems, the more they will reject health surveillance technologies that steal and sell sensitive personal health data.

Don’t Let EHR Vendors Own Your Data

“In a recent blog posting, John Moore and Rob Tholemeier of Chilmark Research ask the question: ‘Who’s Data is it Anyway?’ Your electronic health records data is not the property of your vendor and there are things you can do about it, they contend.”

To view the full article, please visit: Don’t Let EHR Vendors Own Your Data

Abbott’s Privacy Rights Proposals Draw Attention

“Attorney General Greg Abbott‘s support for more stringent privacy laws is getting some notice, as privacy rights activists say his proposals would lead to more protections for Texans. But concerns tied to the enforcement of the proposed policies are also being raised.”

To view the full article, please visit: Abbott’s Privacy Rights Proposals Draw Attention

Myth: The Benefits of Electronic Health Records Outweigh the Privacy Risks

Fact: It’s impossible to weigh the ‘benefits’ of EHRs vs. the ‘risks’ when we have no way of knowing what all the ‘risks’ are. Current health IT systems and data exchanges enable unlimited hidden use and sale of personal health data.

There is no map that tracks hidden disclosures of health data to secondary, tertiary, quaternary, and further downstream users. It’s crazy, but we have no ‘chain of custody’ for our most sensitive personal information: health data.

How can we make informed decisions about using EHRs when there is no map to track the hundreds, thousands, even millions of places our personal health information, from prescriptions to DNA to diagnoses, ends up?

Take a look at this website: http://www.theDataMap.org

  • Harvard Professor Latanya Sweeney leads this project to map the hidden flows of health data.

  • Patient Privacy Rights is a sponsor.

  • Not only is it impossible for individuals to make an informed decision about the risks and benefits of EHRs, it’s ALSO impossible for Congress to create sane health reform and healthcare laws, or to formulate appropriate health and privacy policies that provide ironclad data privacy and security protections, when we have no idea where PHI goes, who uses and sells it, or what it’s used for.

  • One example of not knowing where/how our personal health data ends up: identifiable diabetic patient records are sold online for $14-$25 each. See: http://abcnews.go.com/Health/medical-records-private-abc-news-investigation/story?id=17228986&singlePage=true#.UFKTXVHUF-Y

Privacy-destructive health IT is the exact opposite of what patients expect, and it violates patients’ strong existing rights to health information privacy and control over personal health data:

  • One example: Patients give pharmacies a prescription for only one purpose: to fill their prescription. They don’t expect all 55,000 US pharmacies to sell every prescription, every night. The prescription data mining industry sells our easily identifiable prescription records and collects tens to hundreds of billions of dollars in revenue every year.

  • Another example: Patients expect physicians to keep their records private. They don’t expect physicians or EHRs to sell their sensitive data, treating patient data as another way to make money. But selling patient data is the business model of almost all EHRs, including Practice Fusion, Greenway, Cerner, Athena, GE Centricity, and others. Patients give doctors information for one purpose only: to treat them. They don’t expect it to be used and sold by Business Associates, subcontractors, and subcontractors of the subcontractors for other purposes. Again, US patients have a very long history of rights to health information privacy in law and ethics (the Hippocratic Oath).

Fact: The public will only trust health technology if they control their health data and can get real-time lists of everyone who uses it. Hidden use of personal health data must stop. Data users should ask our consent first. We need control, accountability, and transparency to trust health technology.