Patient Privacy Rights


How did you grade these PHRs?
Why should you worry if a company is "HIPAA compliant"?
Will you allow vendors to respond to these grades?
Have you received any money from these vendors?
Should you use a PHR?
Why is “anonymous”, “de-identified”, or “aggregate” data a problem?
Glossary

How did you grade these PHRs?

We first established the criteria based on the Code of Fair Information Practices, privacy rights protected by the Hippocratic Oath, the U.S. Constitution, common law and state and federal law, and the Privacy Principles developed by the Coalition for Patient Privacy.  We described what we believe are the most meaningful privacy controls individuals should have over health information in electronic health systems.

Next, we described what each letter grade means (A = excellent, C = bare minimum).  For consistency, we gave each letter grade a numerical value: A = 5, B = 4, C = 3, D = 2, F = 1. Finally, we selected PHRs to evaluate and examined their privacy policies.  We viewed the primary privacy policy, any other documents or policies referenced and we signed up for a PHR to see how it functioned.  Only one PHR required payment (CapMed icePHR).

PPR is staffed by consumer advocates, not lawyers.  We did not employ attorneys to legally interpret the policies. Once the reviews were complete, we tallied the total points for each PHR and divided that total by 10, the number of categories.  All categories were weighted equally.
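The scoring method described above can be sketched in a few lines of code. This is a minimal illustration, not PPR's actual tooling, and the category grades used in the example are hypothetical:

```python
# Sketch of the report-card scoring: each of 10 equally weighted
# categories gets a letter grade, letters map to points (A=5 ... F=1),
# and the overall grade is the rounded average.
LETTER_TO_POINTS = {"A": 5, "B": 4, "C": 3, "D": 2, "F": 1}
POINTS_TO_LETTER = {v: k for k, v in LETTER_TO_POINTS.items()}

def overall_grade(category_grades):
    """Average the category grades and round to the nearest letter."""
    if len(category_grades) != 10:
        raise ValueError("the report card uses exactly 10 categories")
    total = sum(LETTER_TO_POINTS[g] for g in category_grades)
    average = total / 10  # total points divided by the number of categories
    return POINTS_TO_LETTER[round(average)], average

# A hypothetical PHR: strong in most categories, weak in two.
letter, avg = overall_grade(["A", "A", "B", "A", "C", "A", "B", "A", "A", "D"])
# 43 points / 10 categories = 4.3, which rounds to a "B"
```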

Two grades were given to Google Health and Microsoft HealthVault, products we refer to as “Platforms.”  Google Health’s and Microsoft HealthVault’s privacy policies apply only to their Platforms, not to any of the companies linked to them.  For example, while the Platform may require the individual’s consent before disclosing any data, a third party such as another PHR, a diabetes-tracking tool, or a research search engine does not necessarily play by the same rules.

One grade was given to the Platform itself and another grade was given to the programs and partner applications linked to the Platform, to highlight the differences between the applicable policies.  The programs and partner applications for each Platform were treated as one group.  There are simply far too many different programs/partners for PPR to grade each individually, so we took a random sampling of these programs/partners.  The grade for these groups of companies (an “F” for both Google Health partners and Microsoft HealthVault programs) does not mean that all of the third-party companies failed.  Rather, some of the companies randomly selected scored poorly because they do not allow meaningful patient control over their information.  Note that NoMoreClipboard.com, a PHR available on both Platforms, earned an “A.”

We also note that if a Program or Partner application is “HIPAA compliant,” it can use any information provided from your account for “treatment, payment and health care operations” without getting your express consent.  This does not give individuals control over their private, sensitive information, and most people have no idea how broad those three categories actually are.

Why should you worry if a company is "HIPAA compliant"?

HIPAA grants corporations the right to use your personally identifiable information without your knowledge or permission for purposes of "Treatment, Payment, & Health Care Operations."  These categories are broad and apply to companies you don't even know are involved with your health care.  Learn more about health privacy in "HIPAA: Intent vs. Reality."

Will you allow vendors to respond to these grades?

We sent the vendors their Report Card in advance.  We welcomed vendors’ responses to the report card, and will post vendors’ comments. We want to open or continue a dialogue with all vendors interested in protecting privacy.  Each report card contains detailed comments for each category.  We reserve the right to respond to all vendor comments we post.

Have you received any money from these vendors?

PPR conducted privacy training for Microsoft in 2009 and was paid for those services.  Privacy trainings are available to any entity interested in developing a corporate culture of privacy.  We do not provide paid consulting to any vendors, though we happily provide feedback.  We are not paid for any feedback.  We also note that Microsoft signed on to the Coalition for Patient Privacy Principles in 2007 and 2009.

Should you use a PHR?

Any PHR that shares any information, whether identifiable or “de-identified/aggregate/anonymous,” with employers, insurers, etc. is risky. Assume your PHR does not give you control over your health information until you affirmatively confirm otherwise.  Be selective about any information you provide.  For example, if you want to track lifestyle information that a doctor, insurer or employer wouldn’t normally have, you may want to use an alias when you set up that account.  Some PHRs let you open an account under an alias or your dog’s name, but a fake name alone will not necessarily make your data safe, because the PHR could use other public online information about you to re-identify your health records.

PPR is not recommending the use of any of the PHRs we reviewed, regardless of the grade earned.  We do not guarantee any of the information provided.  This is simply a guide, prepared by consumer advocates, to raise awareness and share an educated opinion.

Why is “anonymous”, “de-identified”, or “aggregate” data a problem?

It is practically impossible to ensure that anonymous/de-identified/aggregate data cannot be re-identified; far too much information exists and is accessible now to the average person.  Dr. Latanya Sweeney showed that 87% of the U.S. population can be re-identified with just gender, date of birth, and ZIP code.

Data is either useful or anonymous, but never both.  Data may seem anonymous, but when coupled with another data set, the merged data set can often reveal identity.  Consider the data an employer or insurer already has on you, overlapped with “anonymous” data such as age, location, gender, and dates of absence in a report on those who searched for “cancer testing.”  If employers and insurers want to identify sick or expensive people, they can.
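The linkage attack described above takes only a few lines to demonstrate. This is a toy sketch with entirely invented records; the data sets, names, and quasi-identifier fields are hypothetical:

```python
# Sketch of a linkage ("re-identification") attack: join an "anonymous"
# report against a roster the employer already holds, matching on
# quasi-identifiers (age, ZIP code, gender). No names appear in the
# anonymous data, yet the match reveals who searched for what.

# Data an employer already has (invented example records).
employer_roster = [
    {"name": "Alice Smith", "age": 44, "zip": "78701", "gender": "F"},
    {"name": "Bob Jones",   "age": 51, "zip": "78702", "gender": "M"},
]

# The "anonymous" report: no names, only quasi-identifiers.
anonymous_report = [
    {"age": 44, "zip": "78701", "gender": "F", "query": "cancer testing"},
]

def reidentify(roster, anonymous_rows):
    """Match anonymous rows back to named people on shared quasi-identifiers."""
    matches = []
    for row in anonymous_rows:
        for person in roster:
            if all(person[k] == row[k] for k in ("age", "zip", "gender")):
                matches.append((person["name"], row["query"]))
    return matches

matched = reidentify(employer_roster, anonymous_report)
```

With real data sets the join keys are larger and the populations bigger, but the principle is the same: the more columns two data sets share, the more likely a combination of "harmless" attributes points to exactly one person.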