Re: PNAS study on predicting human behavior using digital records

Picture a box with 2,000 or 10,000 puzzle pieces inside; any one piece reveals nothing about the picture. But when all the pieces are assembled, they form an incredibly detailed picture FULL of information.

The data mining industry, including Google, Facebook, Acxiom, and thousands of other largely unknown corporations and foreign businesses, assembles the puzzle of who we are from thousands of bits of data we leave online. They know FAR MORE about each of us than anyone on Earth: more than our partners, our moms and dads, our best friends, our psychoanalysts, or our children know about us.

The UK study shows how easy it is for hidden data mining companies to intimately know (and sell) WHO WE ARE.

Most Americans are not aware of the ‘surveillance economy’ or that data miners can easily collect intimate psychological and physical/health profiles of everyone from online data.

The study:

  • “demonstrates the degree to which relatively basic digital records of human behavior can be used to automatically and accurately estimate a wide range of personal attributes that people would typically assume to be private”
  • “is based on Facebook Likes, a mechanism used by Facebook users to express their positive association with (or “Like”) online content, such as photos, friends’ status updates, Facebook pages of products, sports, musicians, books, restaurants, or popular Web sites”
  • correctly discriminates between:
    • homosexual and heterosexual men in 88% of cases
    • African Americans and Caucasian Americans in 95% of cases
    • Democrats and Republicans in 85% of cases
  • finds that for the personality trait “Openness,” prediction accuracy is close to the test–retest accuracy of a standard personality test
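The pipeline the paper describes is simple enough to sketch. Below is a minimal, purely illustrative Python example (not the authors’ code; the Like matrix and labels are random stand-ins, so the printed score is meaningless): reduce a sparse users-by-Likes matrix with singular value decomposition, fit a logistic regression to predict a binary attribute, and report accuracy as area under the ROC curve, the measure the study uses for dichotomous traits.

```python
# Minimal sketch of Likes -> SVD -> logistic regression, as the study describes.
# All data below is randomly generated for illustration only.
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in for a users x Likes matrix: entry is 1 if the user Liked that page.
n_users, n_likes = 5000, 2000
likes = sparse_random(n_users, n_likes, density=0.01, random_state=0, data_rvs=np.ones)

# Stand-in for a binary attribute reported by the users (e.g., a survey answer).
attribute = rng.integers(0, 2, size=n_users)

# Reduce the sparse Like matrix with SVD (the paper uses 100 components).
components = TruncatedSVD(n_components=100, random_state=0).fit_transform(likes)

# Fit and evaluate a simple classifier on held-out users.
X_train, X_test, y_train, y_test = train_test_split(components, attribute, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```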

The “surveillance economy” is why the US needs FAR STRONGER LAWS, at the very least to prevent the hidden collection, use, and sale of health data, including everything about our minds and bodies, unless we give meaningful informed consent.

This urgent topic (i.e., whether the US should adopt strong data privacy and security protections like the EU’s) will be debated at the 3rd International Summit on the Future of Health Privacy, June 5-6 in DC (it’s free to attend and will also be live-streamed). Register at: www.healthprivacysummit.org

theDataMap™

theDataMap™ is an online portal for documenting flows of personal data. The goal is to produce a detailed description of personal data flows in the United States.

A comprehensive data map will encourage new uses of personal data, help innovators find new data sources, and educate the public and inform policy makers about data sharing practices, so that society can act responsibly to reap the benefits of sharing while addressing the risks of harm. To accomplish this goal, the portal engages members of the public in a game-like environment to report and vet reports of personal data sharing.

Members of the public sign up to be Data Detectives and then work with other Data Detectives to report and vet data sharing arrangements found on the Internet. Data Detectives are responsible for the content on theDataMap™.

See the debut of theDataMap™ from the “Celebration of Privacy” during the 2nd International Summit on the Future of Health Privacy here:

DNA records pose new privacy risks

To view the full article, please visit: DNA Records Pose New Privacy Risks

An article in the Boston Globe highlights the ease with which DNA records can be re-identified. According to the article, “Scientists at the Whitehead Institute for Biomedical Research showed how easily this sensitive health information could be revealed and possibly fall into the wrong hands. Identifying the supposedly anonymous research participants did not require fancy tools or expensive equipment: It took a single researcher with an Internet connection about three to seven hours per person.” Even truly anonymous data was not entirely safe from being re-identified. Researcher Yaniv Erlich “…decided to extend the technique to see if it would work with truly anonymous data. He began with 10 unidentified men whose DNA sequences had been analyzed and posted online as part of the federally funded 1,000 Genomes Project. The men were also part of a separate scientific study in which their family members had provided genetic samples. The samples and the donors’ relationships to one another were listed on a website and publicly available from a tissue repository.”

These findings are incredibly relevant because it is highly possible that “something a single researcher did in three to seven hours could easily be automated and used by companies or insurers to make predictions about a person’s risk for disease.” The federal Genetic Information Nondiscrimination Act protects DNA from being used by health insurers and employers to discriminate against people, but its protections do not extend to every possible use of genetic data.
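A purely illustrative sketch of why this is easy to automate (not the Whitehead team’s method; the records, fields, and matching rule below are hypothetical): once an “anonymous” genomic record can be tied to a few quasi-identifiers, such as a surname inferred from genealogy databases plus an age and a state, matching it against public records takes only a few lines of code.

```python
# Hypothetical illustration of quasi-identifier linkage; all data is made up.
from dataclasses import dataclass

@dataclass
class AnonymousGenome:
    inferred_surname: str  # e.g., inferred from Y-chromosome markers via genealogy sites
    age: int
    state: str

@dataclass
class PublicRecord:
    full_name: str
    age: int
    state: str

def candidate_matches(genome: AnonymousGenome, records: list[PublicRecord]) -> list[PublicRecord]:
    """Return public records consistent with the genome's quasi-identifiers."""
    return [
        r for r in records
        if r.full_name.split()[-1].lower() == genome.inferred_surname.lower()
        and abs(r.age - genome.age) <= 1
        and r.state == genome.state
    ]

# Made-up example data.
records = [PublicRecord("John Doe", 52, "UT"), PublicRecord("Jane Roe", 49, "UT")]
print(candidate_matches(AnonymousGenome("Doe", 52, "UT"), records))
```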

Privacy and Data Management on Mobile Devices

For the entire survey of 1,954 cell phone users, see: http://pewinternet.org/~/media//Files/Reports/2012/PIP_MobilePrivacyManagement.pdf (excerpt below).

When the public learns about hidden data collection and use by cell phone apps, significant numbers of people TURN them OFF:

  • “57% of all app users have either uninstalled an app over concerns about having to share their personal information, or declined to install an app in the first place”

What will the public do when they realize they CANNOT turn off:

  • hundreds of software ‘applications’ at hospitals that collect, use, and sell their health information
  • thousands of EHRs and other health information technologies that collect, use, and sell their health information
  • health-related websites that collect, use, and sell their health information

Patient Trust in Confidentiality Affects Health Decisions

To view the full article by Pablo Valerio, please visit Enterprise Efficiency: Patient Trust in Confidentiality Affects Health Decisions

This article highlights a survey sponsored by FairWarning that looks at how “patient privacy considerations impact the actual delivery of healthcare” in the UK and US.

Key quotes from the story:

-“CIOs and healthcare providers need to ensure the best security, not only because it is the law, but because data breaches actually affect how honest a patient might be with a doctor and how quickly they will seek medical attention.”

-“It is not enough to comply with government regulations about data protection. If a data breach occurs patients are not going to check if the institution was following rules, they are going to blame their executives for allowing the breach to happen, regardless of the reasons.”

The survey cited in the article, “UK: How Privacy Considerations Drive Patient Decisions and Impact Patient Care Outcomes; Trust in the confidentiality of medical records influences when, where, who and what kind of medical treatment is delivered to patients,” compares attitudes about health information privacy in the UK and US.

Some key UK findings are:

-38.3 percent stated they have or would postpone seeking care for a sensitive medical condition due to privacy concerns.

-More than half of patients stated that if they had a sensitive medical condition, they would withhold information from their care provider.

-Nearly 2 out of 5 stated they would postpone seeking care out of privacy concerns.

-45.1 percent would seek care outside of their community due to privacy concerns.

-37 percent would travel… 30 miles or more to avoid being treated at a hospital they did not trust.

US vs UK patients:

-UK patients are almost twice as likely to withhold information from their care provider…if they had a poor record of protecting patient privacy.

-4 out of 10 UK patients versus nearly 3 out of 10 US patients … would put off seeking care … due to privacy concerns.

-97 percent of UK and US patients stated chief executives and healthcare providers have a legal and ethical responsibility to protect patients’ medical records from being breached.

Promising research may protect health records privacy

To view the full article in Modern Healthcare, please visit Promising research may protect health records privacy.

A recent article in ModernHealthcare.com explains a new and promising technology developed by the Wake Forest School of Medicine’s Department of Biomedical Engineering. They have developed a “prototype health information exchange that both works for providers and restores patient control over the flow of their medical images.” The article explains how the new exchange utilizes “what’s called a Patient Controlled Access-key Registry to manage access for both patients and providers. A patient, who would allow another provider to see his or her records, releases an ‘access key’ with a digital signature at a patient portal.”
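As a rough sketch of the access-key idea (a generic illustration, not the Wake Forest implementation; the key type, grant fields, and identifiers below are assumptions), a patient-held key signs a scoped access grant that the receiving system verifies before releasing records:

```python
# Generic illustration of a patient-signed access grant; not the Wake Forest system.
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The private key stays with the patient (e.g., behind a patient portal account);
# the public key is registered with the exchange ahead of time.
patient_key = Ed25519PrivateKey.generate()
patient_public_key = patient_key.public_key()

# A hypothetical access grant: which records, to whom, and until when.
grant = json.dumps({
    "patient_id": "patient-123",
    "recipient": "dr-smith-clinic",
    "scope": ["imaging/2013"],
    "expires": "2013-12-31",
}, sort_keys=True).encode()

signature = patient_key.sign(grant)

# The receiving system verifies the patient's signature before honoring the grant.
try:
    patient_public_key.verify(signature, grant)
    print("Grant verified; release only the records named in its scope.")
except InvalidSignature:
    print("Invalid grant; do not release records.")
```

The point of such a design is that the grant itself carries a verifiable patient signature, so the decision to share travels with the patient rather than being negotiated institution to institution.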

The article also quotes Dr. Peel’s views on the new system: “Psychiatrist and patient privacy advocate Dr. Deborah Peel—often a critic of health IT systems that she sees compromising privacy—says she likes what she reads about the Wake Forest pilot. ‘The majority of current HIT systems and data exchanges violate medical ethics and patients’ long-standing rights to control PHI (protected health information),’ Peel wrote in an email Wednesday. ‘Bravo to the Wake Forest research team for finally building effective electronic patient consent tools. Yes, this model solves the legal problems of data sharing. And yes, it builds patient trust in physicians because it restores the personal control over use and disclosure of protected health information that patients expect.’”

Patient Control Reduces Privacy Issues for Health Data Sharing Networks

See the full article on iHealthBeat.org: Patient Control Reduces Privacy Issues for Health Data Sharing Networks

It’s about time! Congratulations to Wake Forest for building a way to move data that patients can trust. Patients have waited a long time for systems that let them move their own information.

YES, this model solves the legal problems of data sharing—there is no need for expensive contracts between hospitals and doctors.  And YES, it builds patient trust in physicians because it restores the personal control over use and disclosure of protected health information (PHI) that patients EXPECT.

The majority of current HIT systems and data exchanges violate medical ethics and patients’ long-standing rights to control PHI. This kind of electronic consent is THE ONLY way patient data should flow.

BRAVO to the Wake Forest research team for finally building effective electronic patient consent tools.

Only 26 Percent of Americans Want Electronic Medical Records, Says Xerox Survey

Xerox kindly shared all three years of their annual Electronic Health Records (EHR) online surveys, conducted by Harris Interactive. The media, industry, and government unrelentingly promote health technology as the latest and greatest. But the public ain’t buying it. They want smart phones, but they don’t want EHRs.

Clearly the public is not very excited about EHRs; 74% don’t want them. They don’t want them because they understand the problems with EHRs so well.

To view the article, please visit Only 26 Percent of Americans Want Electronic Medical Records, Says Xerox Survey

Not only do the surveys show that a low percentage of Americans want electronic health records; the percentage has also stayed low, at only 26% this year. Overall, 85% of the public has “concerns” about EHRs this year. The surveys also asked about specific concerns and found the public worries that health data security is poor, that data can be lost or corrupted, that records can be misused, and that outages or ‘computer problems’ can take records offline and compromise care. See results below:

To the question do you want your medical records to be digital:

  • 26% said ‘yes’ in 2010
  • 28% said ‘yes’ in 2011
  • 26% said ‘yes’ in 2012

To the question do you have concerns about digital records:

  • 82% said ‘yes’ in 2010
  • 83% said ‘yes’ in 2011
  • 85% said ‘yes’ in 2012

To the question could your information be hacked:

  • 64% said ‘yes’ in 2010
  • 65% said ‘yes’ in 2011
  • 63% said ‘yes’ in 2012

To the question could your digital medical records be lost or corrupted:

  • 55% said ‘yes’ in 2010
  • 54% said ‘yes’ in 2011
  • 50% said ‘yes’ in 2012

To the question could your personal information be misused:

  • 57% said ‘yes’ in 2010
  • 52% said ‘yes’ in 2011
  • 51% said ‘yes’ in 2012

To the question could a power outage or computer problem prevent doctors from accessing your information:

  • 52% said ‘yes’ in 2010
  • 52% said ‘yes’ in 2011
  • 50% said ‘yes’ in 2012

Patient Safety and Health Information Technology: Learning from Our Mistakes

MUST READ article by Ross Koppel about why and how government and industry denial of serious design flaws in electronic health systems endangers patients’ lives and safety. He uses detailed examples, citations, and the historical record to support his case. Flawed technology causes serious patient safety issues, in the same way that flawed technology prevents patient control over who can see, use, or sell sensitive health information.

Yet technology could vastly improve patient safety and put patients back in control over the use of their health data. Why is poor technology design entrenched and systemic? Koppel states, “The essential question is: why has the promise of health IT—now 40 years old—not been achieved despite the hundreds of billions of dollars the US government and providers have spent on it?”

He makes the case that key problems arise from industry domination over the public interest. “Marketing overdrive” has caused:
· Denial and magical thinking: we see the “systematic refusal to acknowledge health IT’s problems, and, most important, to learn from them”

· Prevention of “meaningful regulations since 1997”: “This belief that health IT, by itself, improves care and reduces costs has not only diminished government responsibility to set data format standards, it has also caused us to set aside concerns of usability, interoperability, patient safety, and data integrity (keeping data accountable and reliable).”

· Destructive “lock-in” to flawed technology systems: A full software package from a top firm for a large hospital costs over $180 million, and can cost five times that figure for implementation, training, configuration, cross-covering of staff, and so on.(11,12) Because illness, accidents, and pregnancies cannot be scheduled around health IT training and implementation needs, the hospital must continue to operate while its core information systems are developed and installed. This investment of time and money means the hospital is committed for a decade or more. It also reduces incentives for health IT vendors to be responsive to the needs of current customers.(13,14)

We have been to this rodeo before. Koppel points out these same phenomena occur over and over in many other industries:
“we had dozens of railroad gauges, hundreds of time zones, and even areas with both left- and right-hand driving rules. In all cases, the federal government established standards, and the people, the economy, and especially the resistant industries flourished. Industry claims that such standards would restrict innovation were turned on their heads.”

The health technology industry has failed to reform itself for 40 years. Effective federal laws and regulation are the only path to ensuring innovation and interoperability, to make health IT systems safe for patients and useful to doctors, and to restore individual control over who sees the most sensitive personal information on Earth.

See the full article at Web M&M: Patient Safety and Health Information Technology: Learning from Our Mistakes

Can Privacy & Electronic Medical Records Coexist? — Quotes PPR

An article in Pacific Standard discusses the struggle to maintain patient privacy as electronic health records become the norm. To view the full article, please visit Can Privacy & Electronic Medical Records Coexist?.

A few key quotes from the story:

“…researchers have to figure out how to digitize some of your most sensitive personal information to make it easily accessible to you and your doctors without compromising your privacy before the many other parties who might also like to peek at this data. Researchers lament that it’s currently impossible to track all of the places your digital medical information travels once you leave the doctor’s office. Certainly, pieces of it are shared with your doctor’s office, your doctor’s hospital, your insurance company, your pharmacist and the pharmaceutical company that makes your medicine. Your personal information may also be anonymized and aggregated with other patients to produce data sets used by researchers or traded on the commercial market.”

“Researchers and industry innovators gunning for that 2014 deadline have to figure out how to set all of this information free — when it comes to maximizing the benefit to you as a patient — while, on the other hand, keeping it under some kind of control. And it’s not entirely clear how that architecture might look.”

“‘My big fear is that if we don’t build these systems right, people won’t see doctors,’ said Deborah Peel, the executive director of Patient Privacy Rights and the moderator of the conference discussion.”