Congress sits on hands as health privacy wanes

By David Pittman | Politico.com | 6/12/14 5:00 AM EDT

Everyone from legal scholars to patient privacy advocates — and even the White House — is saying the country’s landmark health privacy law is antiquated and needs to be updated.

But Congress doesn’t appear to be moving any legislation on the issue.

Backers of tougher health data privacy rules argue that much has changed in how people’s health information is collected and handled since the law governing patient records was passed in 1996. Protections added in 2009 don’t fully address the problem, they say.

The Health Insurance Portability and Accountability Act — commonly called HIPAA — largely applies to use of data by health care providers and insurance companies. But they are a smaller and smaller slice of who deals with patient information today.

For example, employee wellness programs, which are increasingly popular and hold potentially private information such as pregnancy status, don’t fall under the HIPAA umbrella. Hospital discharge data is sold by 33 states, according to the Federal Trade Commission, but only three do so in a HIPAA-compliant fashion.

“I think HIPAA does a really good job where it’s relevant,” said Kirk Nahra, a privacy and information security lawyer at Wiley Rein. “What’s happened in the last 15 years is that the space where it’s not relevant has been what’s growing.”

HIPAA governs the doctor-patient and doctor-payer relationships, but it didn’t envision the rest of the universe, and that’s where there is a need for new privacy protections, Nahra said.

Health and fitness apps — of which there are nearly 100,000 available today — are probably the biggest concern. They fall outside HIPAA and are free to collect and share information on their users.

The Privacy Rights Clearinghouse concluded last year that mobile health and fitness apps “are not particularly safe” when it comes to protecting user privacy. It found that 26 percent of the free apps and 40 percent of paid apps didn’t have a privacy policy. Furthermore, 39 percent of free apps and 30 percent of paid apps sent data to a third party not disclosed by the developer.

The FTC mapped where data was being sent from 14 free health and fitness apps. One app transmitted diet, workout, personal identifier and other information to 18 different third parties. Fourteen third parties received consumers’ names and email addresses, and 22 received gender, location and symptom-search information.

The free use of consumer information by app makers is one reason privacy advocates are concerned that Apple is entering the game. The tech giant announced last week it would make HealthKit part of its iOS 8 operating system, set to be released later this year.

The FTC sees all of this as a problem and is looking to Congress for help.

In a recent report on data brokers, the commission recommended Congress consider legislation to force tech companies to obtain express consent from consumers before information is collected or shared.

A White House report on big data and privacy last month noted that current policy “may not be well-suited” in the future. While health data exchanges will help realize technology’s potential, the information often is shared “in ways that might not accord with consumer expectations of the privacy of their medical data.”

“Health care leaders have voiced the need for a broader trust framework to grant all health information, regardless of its source, some level of privacy protection,” the report said.

Despite the pleas for new rules on use of consumer health information, Congress appears to be sitting on its hands. Little legislation exists, and the issue has yet to gain traction.

“The only thing that is likely to get congressional interest is for there to be a major data tragedy,” said Nicolas Terry, health law professor at Indiana University law school. “It’s very hard at the moment to see much consensus out there. Everyone says they believe in privacy. Privacy is very important. Privacy is a right. But actually moving the ball forward to protect consumers, given the massive weight of the information lobby, seems very hard.”

Congress has been working on data security and breach notification issues — especially in light of recent high-profile cases involving Target and others — with a decent chance of passing something by the end of the year.

Privacy is another issue. “There’s no consensus on broader privacy issues,” Nahra said.

Lawmakers on Capitol Hill have taken some steps to improve consumer privacy protections since HIPAA was passed. Anticipating the advent of electronic medical records, they included several provisions in the 2009 HITECH Act, including a ban on the sale of personal health information, breach notification requirements and penalties for privacy violators.

One possible source of inaction is the seemingly immovable lobbying force. Companies such as Microsoft, Google, Siemens, the Mayo Clinic, WebMD, IMS Health and IBM all spent money lobbying Congress last year on health privacy issues, according to disclosure forms.

Even Nike — maker of the popular Nike+ fitness app that comes preloaded on all iPhones — disclosed lobbying on privacy issues in 2013.

Terry said consumers could incite change if they demanded it. Automobile makers lobbied hard against safety regulations in the 1960s and 1970s, but car safety is ubiquitous today because of pressure from car buyers, he said.

The FTC has the authority to halt companies’ deceptive practices if they fail to disclose certain data uses to consumers, notes Justin Brookman, director of consumer privacy at the Center for Democracy & Technology, which advocates stronger protections.

As long as the FTC and Congress remain inactive, and consumers remain passive, it’s up to Washington power brokers to point out HIPAA’s inadequacies.

“I do believe it’s time that we look beyond [HIPAA],” Karen DeSalvo, national coordinator for health IT, said at the recent Health Privacy Summit. “As this field rapidly evolves, we need to think about what additional protections might need to be in place.”

To view online:
https://www.politicopro.com/go/?id=35019

Risking OCR and Patient Ire, Many CEs Don’t Comply With Patient Access Rules

June 2014 Volume 14 Issue 6
aishealth.com

REPORT ON PATIENT PRIVACY delivers timely news and business strategies for safeguarding patient privacy and data security.

In apparent defiance of final HITECH regulations, many HIPAA covered entities (CEs) are not offering patients the option of receiving an electronic copy of their medical records, let alone in the “form and format” of their choosing, as has been required since January 2013.

Some are imposing fees for copies and applying limits on what they will provide that do not appear to be in line with regulations. Health systems with multiple hospitals have implemented the access requirements inconsistently across their medical centers, meaning some may be in compliance while others are not.

All of this is evident on the websites of covered entities, in their pages that outline the policies and procedures for patients to obtain their protected health information (PHI) — so officials from the Office for Civil Rights (OCR) can readily see it also. An OCR spokeswoman tells RPP “we can and we have” brought enforcement actions against CEs that violate the access requirements.

Patient advocates, medical records providers, privacy experts and others also tell RPP about a multitude of likely unlawful hoops that CEs make people jump through to try to get their records.

“Unless you are behind the curtain like I am or unless you start finding the right stones to turn over, you don’t ever get to see the horror show that really exists in various degrees across the country,” says Chris Carpenter, director of operations for Diversified Medical Record Services, Inc. (DMRS), a business associate that processes records requests for hospitals and physicians’ offices nationwide.

To view the full article, please visit Risking OCR and Patient Ire, Many CEs Don’t Comply With Patient Access Rules

FTC Calls for Data Broker Transparency

By Marianne Kolbasuk McGee | healthcareinfosecurity.com
May 29, 2014

The Federal Trade Commission is urging Congress to enact privacy legislation that would provide consumers with more transparency about the activities of data brokers that collect sensitive health and financial data.

Reacting to the FTC recommendation, two consumer advocates say the explosion of data broker activities in recent years, coupled with regulatory gaps, points to the need for some legislative reforms to protect consumer privacy.

A May 27 FTC report that examined nine companies describes data brokers as “companies whose primary business is collecting personal information about consumers from a variety of sources and aggregating, analyzing and sharing that information, or information derived from it, for purposes such as marketing products, verifying an individual’s identity, or detecting fraud.”

The FTC says data brokers raise privacy concerns for consumers because “significantly, data brokers typically collect, maintain, manipulate and share a wide variety of information about consumers without interacting directly with them.”

The report notes: “In light of these findings, the commission unanimously renews its call for Congress to consider enacting legislation that would enable consumers to learn of the existence and activities of data brokers and provide consumers with reasonable access to information about them held by these entities.”

Deborah Peel, M.D., founder of advocacy group Patient Privacy Rights, says federal legislators and regulators need to crack down on data brokers, especially those that deal with sensitive information, such as health data.

“This is clearly a case where the government must pass laws that require personal control over personally identifiable information to restore our rights to privacy, because we can’t possibly do it ourselves,” Peel says. “Worse, the FTC seems not to have a handle on the size of the health data broker industry. … Personal information is the ‘oil’ of the digital age – and our personal information belongs to each of us. … If the data brokers want our data, they should just ask. If we think the benefits are worth it, we will say ‘yes’.”

To view the full article, please visit FTC Calls for Data Broker Transparency

What You Need to Know About Patient Matching and Your Privacy and What You Can Do About It

Today, ONC released a report on patient matching practices, and to the casual reader it will look like a byzantine subject. It’s not.

You should care about patient matching, and you will.

It affects your ability to coordinate care and to purchase life and disability insurance, and maybe even your job. Through ID theft, it also affects your safety and security. Patient matching’s most significant impact, however, could be on your pocketbook, as it’s being used to fix prices and reduce competition in a high-deductible insurance system that leaves families subject to up to $12,700 in out-of-pocket expenses every year.

Patient matching is the healthcare cousin of NSA surveillance.

Health IT’s watershed will come when people finally realize that hospital privacy and security practices are unfair and begin to demand consent, data minimization and transparency for our most intimate information. The practices suggested by Patient Privacy Rights are relatively simple and obvious and will be discussed toward the end of this article.

Health IT tries to be different from other IT sectors. There are many reasons for this; few of them are good. Health IT practices are dictated by HIPAA, whereas the rest of IT is governed by either the FTC or the Fair Credit Reporting Act. Healthcare is mostly paid for by third-party insurance, so the risks of fraud differ from those in traditional markets.

Healthcare is delivered by strictly licensed professionals who are regulated differently from the institutions that purchase the health IT. These are the major reasons for healthcare IT exceptionalism, but they are not a good excuse for bad privacy and security practices, so this is about to change.

Health IT privacy and security are in tatters, and nowhere is that more evident than in the “patient matching” discussion. Although HIPAA has some significant security features, it also eliminated a patient’s right to consent and Fair Information Practice.

Patient matching by all sorts of health information aggregators and health information exchanges is involuntary and hidden from the patient as much as NSA surveillance is.

As patients, we have no idea how many databases are tracking our every healthcare action. We have no equivalent of the Fair Credit Reporting Act to cover these database operators. The databases are both public and private. The public ones are called Health Information Exchanges, All Payer Claims Databases, Prescription Drug Monitoring Programs, Mental Health Registries, Medicaid, and more.

The private ones are called “analytics” and sell billions of dollars’ worth of our aggregated data to hospitals eager to improve their margins, if not their mission.

The ONC report overlooks the obvious issue of FAIRNESS to the patient. The core principles of Fair Information Practice are Consent, Minimization and Transparency. The current report ignores all three:

- Consent is not asked. By definition, patient matching is required for information sharing, so patient matching without patient consent leads to sharing of PHI without patient consent. The consent form used to authorize patient matching must list the actual parameters that will be used for the match. Today’s generic Notices of Privacy Practices are as inadequate as signing a blank check.

- Data is not minimized. Citizen matching outside of the health sector is usually based on a unique and well-understood identifier such as a phone number, email address, or SSN. To the extent that the report does not allow patients to specify their own matching criterion, a lot of extra private data is being shared for patient matching purposes. This violates data minimization (see the sketch after this list).

- Transparency is absent. The patient is not notified when they are matched. This violates the most basic principles of error management and security. In banking or online services, it is routine to get a simple email or a call when a security-sensitive transaction is made.

This must be required of all patient matching in healthcare. In addition, patients are not given access to the matching database. This elementary degree of transparency for credit bureaus that match citizens is law under the Fair Credit Reporting Act and should be at least as strict in health care.
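
To make the data-minimization and transparency points concrete, here is a minimal Python sketch. Everything in it is hypothetical: the PatientRecord fields, the direct_address attribute and the notify_patient hook are illustrative assumptions, not any real HIE, EHR or Direct API. The idea is simply that a match on one patient-specified identifier exposes far less data than a demographic match, and that the patient is told whenever a match happens.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PatientRecord:
    # Hypothetical fields for illustration only; not a real HIE schema.
    direct_address: Optional[str]  # patient-specified identifier, e.g. "jane@direct.example.org"
    name: str
    dob: str        # "YYYY-MM-DD"
    zip_code: str
    gender: str

def notify_patient(rec: PatientRecord, reason: str) -> None:
    # Transparency hook: a real system would send a secure Direct message,
    # plain email or text; this sketch just prints the notice.
    print(f"Notice to {rec.direct_address or rec.name}: your records were {reason}.")

def match(incoming: PatientRecord, registry: List[PatientRecord]) -> Optional[PatientRecord]:
    # Data-minimized path: compare only the identifier the patient chose to share.
    if incoming.direct_address:
        for rec in registry:
            if rec.direct_address == incoming.direct_address:
                notify_patient(rec, "matched on your Direct address only")
                return rec
    # Demographic fallback: every compared field is exposed to the matcher.
    for rec in registry:
        if (rec.name.lower(), rec.dob, rec.zip_code, rec.gender.lower()) == (
            incoming.name.lower(), incoming.dob, incoming.zip_code, incoming.gender.lower()
        ):
            notify_patient(rec, "matched on name, date of birth, ZIP and gender")
            return rec
    return None

# Example: a registry entry and an incoming query that share only a Direct address.
registry = [PatientRecord("jane@direct.example.org", "Jane Q. Patient", "1980-04-02", "02139", "F")]
query = PatientRecord("jane@direct.example.org", "J. Patient", "1980-04-02", "02139", "F")
match(query, registry)  # prints a notice; no demographic comparison was needed

The contrast between the two branches is the data-minimization argument in miniature: the first compares a single value the patient chose to share, while the demographic fallback discloses an entire profile to the matcher.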

These elementary features of any EHR and any exchange are the watershed defining patient-centered health IT. If a sense of privacy and trust doesn’t push our service providers to treat patients as first-class users, then the global need for improved cybersecurity will have to drive the shift. Healthcare is critical infrastructure just as much as food and energy.

But what can you, as a patient, do to hasten your emancipation? I would start with this simple checklist:

Opt out of sharing your health records unless the system offers:

  • Direct secure messaging with patients
  • Plain email or text notification of records matching
  • Patient-specified Direct email as match criterion
  • Your specific matching identifiers displayed on all consent forms
  • Online patient access to matchers and other aggregator databases

None of these five requirements are too hard. Google, Apple and your bank have done all of these things for years. The time has come for healthcare to follow suit.

Adrian Gropper, MD, is Chief Technical Officer of Patient Privacy Rights and participates in Blue Button+, Direct secure messaging governance efforts and the evolution of patient-directed health information exchange.

Check out the Latest from Dr. Gropper, courtesy of The Healthcare Blog.

Did Tim Armstrong’s ‘Distressed Babies’ Comment Violate HIPAA Privacy Laws?

US citizens have a fundamental Constitutional right to health information privacy, but they can’t easily sue. Only federal employees can sue under the Privacy Act of 1974, as vets did when a laptop with millions of health records was stolen. Even with strong state health privacy laws and state constitutional rights to privacy in place, it’s very hard to sue because most courts demand proof of monetary harm. This new digital disaster, the exposure and sale of sensitive personal health data, can’t be stopped without stronger, clearer federal laws, or unless US citizens boycott the corporations that violate their rights to health privacy.

-Deb

This blog written in response to the following article:

Did Tim Armstrong’s ‘Distressed Babies’ Comment Violate HIPAA Privacy Laws?
By Abby Ohlheiser
The Wire, February 10, 2014

Revelations by AOL Boss Raise Fears Over Privacy

By Natasha Singer
NYTimes.com, February 10, 2014

Tim Armstrong, the chief executive of AOL, apologized last weekend for publicly revealing sensitive health care details about two employees to explain why the online media giant had decided to cut benefits. He even reinstated the benefits after a backlash.

But patient and work force experts say the gaffe could have a lasting impact on how comfortable — or discomfited — Americans feel about bosses’ data-mining their personal lives.

Mr. Armstrong made a seemingly offhand reference to “two AOL-ers that had distressed babies that were born that we paid a million dollars each to make sure those babies were O.K.” The comments, made in a conference call with employees, brought an immediate outcry, raising questions over corporate access to and handling of employees’ personal medical data.

“This example shows how easy it is for employers to find out if employees have a rare medical condition,” said Dr. Deborah C. Peel, founder of Patient Privacy Rights, a nonprofit group in Austin, Tex. She urged regulators to investigate Mr. Armstrong’s disclosure about the babies, saying “he completely outed these two families.”

To view the full article, please visit Revelations by AOL Boss Raise Fears Over Privacy

Guest Blog – The AOL Babies: Our Healthcare Crisis in a Nut

Check out the latest from Nic Terry, courtesy of HealthLawProf Blog.

Where does one start with AOL CEO Armstrong’s ridiculous and unfeeling justifications for changes in his company’s 401(k) plan? Cable TV and Twitter came out of the blocks fast with the obvious critiques. And the outrage only increased after novelist Deanna Fei took to Slate to identify her daughter as one of the subjects of Armstrong’s implied criticism. Armstrong has now apologized and reversed his earlier decision.

As the corporate spin doctors contain the damage, Armstrong’s statements likely will recede from memory, although I am still hoping The Onion will memorialize Armstrong’s entry into the healthcare debate (suggested headline, “CEO Discovers Nation’s Healthcare Crisis Caused by 25 Ounce Baby”). But supposing (just supposing) your health law students ask about the story in class this week. What sort of journey can you take them on?

First (but only if you are feeling particularly mean), you could start with HIPAA privacy. After all, intuitively it seemed strange to hear an employer publicly describing the serious health problems of employees’ family members. With luck your students will volunteer that the HIPAA Privacy Rule does not apply to employers (not “covered entities”). True, but AOL provided employees and their families with a health plan. Assume this was an employer-sponsored plan of some scale. It remains the case that the plan and not the employer is subject to the Privacy Rule, although following the Omnibus rule, the plan and its business associates are going to face increased regulation (such as breach notification, new privacy notices, etc.). The employer’s responsibilities are to be found at 45 CFR 164.504 and primarily 164.504(f) (and here we descend deep into the HIPAA weeds). The employer must ensure that the plan sets out the plan members’ privacy rights vis-à-vis the employer. For plans like these the employer can be passed somewhat de-identified summary information (though for very limited purposes that don’t seem to include TV appearances). However, if the employer essentially administers the plan then things get more complicated. Firewalls are required between different groups of employees, and employer use of PHI is severely limited. By the way, and in fairness to Mr. Armstrong, there are many things we don’t know about the AOL health plan, the source of his information about the “distressed babies,” whether any PHI had been de-identified, etc. Yet, at the very least, AOL may have opened itself up to OCR asking similar questions and starting an investigation into how AOL treats enrollee information.

Second, this storm about the babies’ health insurance should provide a good basis for discussion of the various types of health insurance and their differential treatment by the Affordable Care Act. A large company likely will offer either a fully-insured or self-insured plan to its employees. If the latter, would your students have recommended reinsurance against claim “spikes” with a stop-loss policy? ACA should have relatively little impact on such plans or their cost except where the plans fall beneath the essential benefits floor. Contrast such plans with those traditionally offered on the individual market that are now being replaced with the lower cost (subject again to extra costs associated with essential benefits) health exchange-offered plans.

Third, this entire episode raises the question of health care costs and, specifically, the pricing of health care. On first hearing, a million-dollar price tag seems extraordinary. Yet as Ms. Fei noted in her Slate article, her daughter spent three months in a neonatal ICU and endured innumerable procedures and tests resulting in “a 3-inch thick folder of hospital bills that range from a few dollars and cents to the high six figures.” Now, the ACA may be criticized for not doing enough to cut costs (how about a quick pop quiz on what it does try to do?), but is there any truth to the argument that it raises health care costs? Recent investigative work by Steve Brill and fine scholarship by Erin Fuse Brown have highlighted both high prices and high differential pricing in health care. So why would a corporate executive (either directly or indirectly) blame high prices on the ACA? Are, for example, technology markets so different that the reasons for health care costs are underappreciated? And by extension, instead of fighting the ACA, why are corporate CEOs not urging a second round of legislation aimed specifically at reducing the cost of healthcare for all? After all, it is highly unlikely FFS pricing would be tolerated in their non-health domains. Or does such a group prefer the status quo and what Beatrix Hoffman critically terms rationing by price?

My Baby and AOL’s Bottom Line

By Deanna Fei
Slate Magazine, February 9, 2014

That “distressed baby” who Tim Armstrong blamed for benefit cuts? She’s my daughter.

Late last week, Tim Armstrong, the chief executive officer of AOL, landed himself in a media firestorm when he held a town hall with employees to explain why he was paring their retirement benefits. After initially blaming Obamacare for driving up the company’s health care costs, he pointed the finger at an unlikely target: babies.

Specifically, my baby.

“Two things that happened in 2012,” Armstrong said. “We had two AOL-ers that had distressed babies that were born that we paid a million dollars each to make sure those babies were OK in general. And those are the things that add up into our benefits cost. So when we had the final decision about what benefits to cut because of the increased healthcare costs, we made the decision, and I made the decision, to basically change the 401(k) plan.”

Within hours, that quote was all over the Internet. On Friday, Armstrong’s logic was the subject of lengthy discussions on CNN, MSNBC, and other outlets. Mothers’ advocates scolded him for gross insensitivity. Lawyers debated whether he had violated his employees’ privacy. Health care experts noted that his accounting of these “million-dollar babies” seemed, at best, fuzzy.

Plenty of smart, witty people took to Twitter to express their outrage—or mock outrage. The phrase “distressed babies” became practically an inside joke, as in, “How many distressed babies does AOL pay this guy?” A few AOL employees made cracks like this: “I swear I didn’t have any babies in 2012. Don’t hate me for messing up your 401(k).”

To view the full article, please visit My Baby and AOL’s Bottom Line

New CLIA rule talks the talk, but it doesn’t walk the walk

Deborah Peel, MD, Founder and Chair of Patient Privacy Rights

The federal government released an update to the CLIA rule this week that will require all labs to send test results directly to patients. But the regulations fail to achieve the stated intent to help patients. The rule allows labs to delay patient access to test results up to 30 days, and the process for directly obtaining personal test results from labs is not automated.

The new rule also fails to help patients in significant ways:

  • Real-time, online test results are not required. The federal government should have required all labs to use technology that benefits patients by enabling easy, automatic access to test results via the Internet in real-time. Unless we can obtain real-time access to test results, we can’t get a timely second opinion or verify the appropriate tests were ordered at the right time for our symptoms and diseases.
  • Labs are allowed to charge fees for providing test results to patients.  If labs can charge fees, they will not automate the process for patients to obtain results. Labs that automate patient access to test results online would incur a one-time cost.  After labs automate the process, human ‘work’ or time is no longer needed to provide patients their test results, so the labs would have no ongoing costs to recoup from patients.
  • Labs should be banned from selling, sharing, or disclosing patient test results, without meaningful informed consent, to anyone except the physician who ordered the tests. This unfair and deceptive trade practice should be stopped. No patient expects labs to sell or share their test results with any other person or company except the physician who ordered the test(s).

This rule raises a question: why do so many federal rules for improving the healthcare system fail to require technologies that benefit patients?

Technology could provide enormous benefits to patients, but the US government caters to the healthcare and technology industries, instead of protecting patients.

Current US health IT systems actually facilitate the exploitation of patients’ records via technology. When HHS eliminated patient control over personal health data from HIPAA in 2002, it created a massive hidden US data broker industry that sells, shares, aggregates and discloses longitudinal patient profiles (for an example, see IMS’ SEC filing, which details the sale of 400M longitudinal patient profiles to 5K clients, including the U.S. government).

Meanwhile, even the most mundane, annoying, repetitive tasks patients must perform today–like filling out new paper forms with personal information every time we visit a doctor–are not automated for our convenience or to improve data quality and accuracy.

Shouldn’t IT improve patients’ experiences, treatment, and restore personal control over sensitive health information?

deb

You can also view a copy of this blog post here

WPF Report — Paying out of Pocket to Protect Health Privacy: A New but Complicated HIPAA Option; A Report on the HIPAA Right to Restrict Disclosure

San Diego & Washington, D.C. — The World Privacy Forum published a new report today that helps patients understand and use the new HIPAA right to restrict disclosure of their medical information to health plans when treatment is paid for out of pocket in full. The report contains practical advice and tips for patients about how to navigate the new right, which went into effect last year. Paying Out of Pocket to Protect Health Privacy: A New But Complicated HIPAA Option; A Report on the HIPAA Right to Restrict Disclosure is one of the first reports on this topic written for patients.

“The new HIPAA right that lets patients restrict disclosures of their health information is actually not well known yet, and that needs to change,” said Pam Dixon, Executive Director of the World Privacy Forum. “This report has specific, concrete tips and information that will help patients use this important new right.” The report, written by Bob Gellman and Pam Dixon, is available free of charge at www.worldprivacyforum.org.

Key points:

  • A patient has the right to prevent a health care provider from reporting information to a health insurer if the patient pays in full.
  • In order to prevent disclosure of information to a health plan, patients must make a Request to Restrict Disclosure.
  • Under the new changes to HIPAA, a patient has the firm right to demand, not just request, that a provider not disclose PHI to a health plan when certain conditions are met.
  • The conditions to be met can be complex, and work best with some advance planning.

Additional tips are in the report.

The bipartisan Coalition for Patient Privacy worked to get this key consumer protection into HITECH.

Bob Gellman and Pam Dixon are available to discuss tips and advice for patients on how to use the new HIPAA right.

Links:

The report Paying Out of Pocket to Protect Health Privacy: A New But Complicated HIPAA Option; A Report on the HIPAA Right to Restrict Disclosure is available in PDF or in text.

Permalink: http://www.worldprivacyforum.org/2014/01/wpf-report-paying-out-of-pocket-to-protect-health-privacy/

Contact:

Bob Gellman 202-543-7023

Pam Dixon 760-712-4281

info@worldprivacyforum.org