Privacy and Data Management on Mobile Devices

See this link for the entire survey of 1,954 cell phone users (see excerpt below): http://pewinternet.org/~/media//Files/Reports/2012/PIP_MobilePrivacyManagement.pdf

When the public learns about hidden data use and collection on cell phones, significant numbers of people TURN them OFF:

  • “57% of all app users have either uninstalled an app over concerns about having to share their personal information, or declined to install an app in the first place”

What will the public do when they realize they CANNOT turn off:

  • hundreds of software ‘applications’ at hospitals that collect, use, and sell their health information
  • thousands of EHRs and other health information technologies that collect, use, and sell their health information
  • health-related websites that collect, use, and sell their health information

Attackers Demand Ransom After Encrypting Medical Center’s Server

To view the full article by John E. Dunn, please visit CIO: Attackers Demand Ransom After Encrypting Medical Center’s Server

What happens to patients when their doctors can’t get their records because thieves encrypted them? Federal law has required strong health data security protections since 2002, but 80% of hospitals and practices don’t encrypt patient data. If The Surgeons of Lake County had been following the law and encrypted their records, this attack could not have happened.

How a Lone Grad Student Scooped the Government and What It Means for Your Online Privacy

See the full article at ProPublica.org: How a Lone Grad Student Scooped the Government and What It Means for Your Online Privacy

Sobering. Silicon Valley decides what privacy rights we have online, in clouds, in electronic health systems, in apps, on social media, and on mobile devices. Our fundamental Constitutional rights to privacy—to control personal information about our lives, minds, and bodies—are defended by lone grad students, European Data Commissioners, a few small privacy advocacy organizations, the FTC, and a handful of whistleblowers.

A PREDICTION: Selling intimate cyber-profiles will end when the public discovers that NOTHING about their minds and bodies is private.

The lack of control over sensitive health data will be the nation’s wake-up call to rein in Silicon Valley and restore the right to be ‘let alone’. See: Olmstead v. United States, 277 U.S. 438, 478, 48 S.Ct. 564, 572 (1928) (Brandeis J., dissenting).

  • Cyber-profiles of our minds and bodies contain far more sensitive information than mothers, lovers, friends, Rorschach tests, or psychoanalysts could ever reveal.
  • “If you are not paying for it, you’re not the customer; you’re the product being sold”, see Andrew Lewis at: http://www.metafilter.com/user/15556.
  • 35-40% of us are “Health Privacy Intense”—a very large minority; see Westin’s keynote slides from the 1st International Summit on the Future of Health Privacy: http://tiny.cc/9alvgw

THE TIPPING POINT will be when the public discovers that electronic health systems facilitate cyber-theft, data mining, data sales, ‘research’ without consent, and allow thousands of strangers to snoop in millions of patient records (think George Clooney and more: http://www.foxnews.com/story/0,2933,348988,00.html).

Health data is the most sensitive personal information on Earth. Everything from prescription records to DNA to diagnoses is a HOT BUTTON.

Instead of enabling patients to decide which physicians or researchers they want to see their health records, corporate and government data holders decide who can use and sell Americans’ sensitive health data—upending centuries of law and ethics based on the Hippocratic Oath, which requires physicians to obtain consent before disclosing any information.

Targeted attacks cost companies an average of $200k

See the full article at SC Magazine: Targeted attacks cost companies an average of $200k

It always costs more to repair than to prevent. The curious thing is that federal law mandated basic security protections in HIPAA, but industry never bothered to implement them because the law was never enforced.

Here we are 12 years after the HIPAA Privacy Rule was implemented:

  • the Coalition for Patient Privacy got MUCH tougher security rules and enforcement into HITECH
  • breaches are rampant
  • 80% of hospitals still don’t encrypt data

What’s wrong with this picture? Register for the 2nd International Summit on the Future of Health Privacy June 6-7 in Washington, DC—attending or watching via live streaming video is free: http://tiny.cc/p4fqew. Security technologies are critical for privacy—see top US computer scientists discuss “ideal” technologies for health data privacy and security.

Texas Error Exposed Over 13 Million Voters’ Social Security Numbers

See the full article in DataBreaches.net: Texas Error Exposed over 13 Million Voters’ SSNs

This story shows how easy it is to disclose the Social Security numbers of 13 million people at once. The data came from Texas’ voter registration database, which was attached to a court report, but security breaches of the personal health information of millions of patients are also very common (see the recent Utah and BCBS of TN breaches). Today’s electronic systems enable many new ways to breach data security and expose personal information.

The story describes a government employee who attached over 13 million SSNs to a report and sent it to a 3rd party without anyone else reviewing his or her actions before the data was disclosed. Where should the bar be set for disclosing personally identifiable information in any report? At 1 million records? At 100 million records?
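One safeguard the question above implies is an automated pre-release scan: no report leaves the organization until software, not a single employee's judgment, checks it for identifiers. Below is a minimal sketch of such a check; the function names, the regex policy, and the zero-tolerance threshold are all hypothetical illustrations, not any agency's actual procedure.

```python
import re

# Match SSN-shaped strings like "123-45-6789" (a deliberately simple policy).
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def count_possible_ssns(report_text: str) -> int:
    """Count strings in the report that look like Social Security numbers."""
    return len(SSN_PATTERN.findall(report_text))

def release_allowed(report_text: str, threshold: int = 0) -> bool:
    """Block release if the report contains more SSN-like strings than policy allows."""
    return count_possible_ssns(report_text) <= threshold

report = "Voter 123-45-6789 ... Voter 987-65-4321 ..."
print(count_possible_ssns(report))   # 2
print(release_allowed(report))       # False: the report is blocked
```

A real deployment would scan attachments as well as body text and cover other identifier formats, but even this crude gate would have flagged a file carrying 13 million SSNs.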

Most of the US health care system lacks effective protocols and procedures to protect data security and to prevent inappropriate data release and data breaches. Health data privacy and security require comprehensive and meaningful protections. We have a long way to go. Vastly expanding health IT systems before these problems are solved is a prescription for more data breaches.

Re: Genetic Bar Code Search – Finding People in Huge Gene Pools

In response to the PopSci.com article: Genetic Bar Code Search Can Use RNA to Pick Out Individuals From Huge Gene Pool

Quote from the principal investigator of the Mount Sinai study: “Rather than developing ways to further protect an individual’s privacy given the ability to collect mountains of information on him or her, we would be better served by a society that accepts the fact that new types of high-dimensional data reflect deeply on who we are,” he said. “We need to accept the reality that it is difficult—if not impossible—to shield personal information from others. It is akin to trying to protect privacy regarding appearances, for example, in a public place.”

Genetic privacy may be difficult to achieve, but it remains essential for people to trust physicians, researchers, health IT, and the government.

The public will not accept the idea that genetic information “is in the public domain” anytime soon. We never agreed to have our genetic information made public, and have fought for years to preserve genetic privacy at the state and federal levels. Those who built systems to take blood and tissue and do research without consent could have easily anticipated massive public concerns about such unethical research practices–and not built systems that violate Americans’ expectations and strong rights to health privacy.

Clearly it’s time for Congress to pass a federal law restoring personal ownership and control over blood and tissue that leaves our bodies, and restore the right of informed consent before any research can be done using our blood, tissue, or health information.

Re: Health Industry Under-Prepared to Protect Patient Privacy, Says PwC Report

In response to the Security Week article: Health Industry Under-Prepared to Protect Patient Privacy, Says PwC Report

The US is facing an unprecedented privacy crisis. The healthcare industry is extremely negligent about protecting data security and privacy (patient consent). At the same time 3/4 of the healthcare industry further risks patient privacy by selling or intending to sell data for secondary uses. Data theft and sales are driven in large part because, “Digitized health data is becoming one of the most highly valued assets in the health industry.”

  • Sixty-one percent of pharmaceutical and life sciences companies, 40 percent of health insurers, and 38 percent of providers currently share information externally. Of those organizations that share data externally, only two in five pharmaceutical and life sciences companies (43 percent) and one in four insurers (25 percent) and providers (26 percent) have identified contractual, policy or legal restrictions on how the data can be used.
  • Most corporations using patient data lack an effective consent process: “Only 17 percent of providers, 19 percent of payers and 22 percent of pharmaceutical/life sciences companies have a process in place to manage patients’ consent for how their information can be used.”

It’s a double whammy—not only is sensitive health information at high risk of misuse, sale, and breach INSIDE healthcare organizations, it’s also sold to OUTSIDE organizations that lack effective security and privacy measures.

  • “Nearly three quarters (74 percent) of healthcare organizations surveyed said they already do or intend to seek secondary uses for health data; however, less than half have addressed or are in the process of addressing related privacy and security issues.”

PricewaterhouseCoopers surveyed 600 executives from US hospitals and physician organizations, health insurers, and pharmaceutical and life sciences companies. Data security and privacy practices were abysmal despite new enforcement efforts by the Administration, and despite hundreds of major data breaches compromising the privacy of millions of Americans.

Why aren’t Congress and the public outraged that the privacy and security of health information is so bad? If the banking industry operated like this there would be MAJOR oversight hearings and new laws.

The idea that today’s electronic healthcare systems and data exchanges safeguard health data is simply wrong. Clearly federal and state oversight and penalties for failure to protect the most sensitive personal data on earth need to be increased.

Physician’s computers were stolen

See the full story from MySanAntonio.com: Physician’s computers were stolen

“Five computers containing medical and personal information of more than 3,000 patients were stolen from a Stone Oak physician’s office in October.

Dr. Sudhir Gogu of the Stone Oak Urgent Care & Family Practice said the computers were stolen after an office door had been pried open sometime during the weekend of Oct. 22-23, according to the police report.

A San Antonio Police Department spokesman said in an email Wednesday that the computers have not been recovered and there have been no arrests…

…Dr. Deborah Peel, founder and chairman of Patient Privacy Rights, an organization focused on putting people in control of their electronic health information, called medical identity theft a dangerous crime.

“It typically costs the average victim at least $20,000, and health plans typically increase your premiums … or may even cancel your coverage,” Peel said.

Peel criticized the health industry for failing to take data protection seriously.

“It’s estimated that 80 percent of hospitals don’t encrypt data,” she said. “Can you imagine if your banks didn’t encrypt and keep your financial information secure? We wouldn’t even let them be banks.””

Re: Study shows privacy of medical records is weaker in the U.S.

A study of US and EU health data protections in the Journal of Science & Technology Law concluded Americans “have no real control over the collection of sensitive medical information if they want to be treated.”

Wow! It’s great to see legal scholars second the message that Americans’ rights to health privacy were eliminated.

You can see the article on the study in The Epoch Times here, written by Mary Silver.

For years, Patient Privacy Rights and the bipartisan Coalition for Patient Privacy were the lone voices carrying this message to Congress and the public.

Public and expert support to restore control over sensitive health data will only build. Soon, no one will buy the argument that privacy is an obstacle to electronic health systems.

Here are some other key quotes from the story:

  • “EU countries have adopted electronic health records and systems, or EHRs, and legally protected privacy at the same time.”
  • “The 1950 Council of Europe Convention identified individual privacy as a fundamental value”
  • “the good aspects of EHRs can be undermined by the bad consequences of poor privacy practices and the ugly effects of inadequate security”
  • “patient privacy is much better protected in Europe”
  • “European patients are able to encapsulate particularly sensitive medical information, and an individual has far greater access to and control over his records in Europe than in America.”

So, again why is the US government rushing to spend $29 billion on health IT systems that offer neither privacy nor security?

Re: SAIC Hit With $4.9B Lawsuit Over TRICARE Data Theft

See article for reference from NextGov, “SAIC Hit With $4.9B Lawsuit Over TRICARE Data Theft,” by Bob Brewin.

We can expect to see many more lawsuits over breaches because most US health systems have abysmal data security and by design allow thousands of employees to access the sensitive health information of millions of patients. This immense scale of damage was simply impossible with paper systems.

Ironclad security is very difficult technically (think WikiLeaks) because health systems were architected to enable ‘open access’ by hundreds or thousands of employees to millions of sensitive health records.

Today, the only ‘barrier’ to health data access in the US is ‘pop-up’ screens that ask, “Do you have a right to access this patient’s information?” This is hardly effective. Yes, of course, after-the-fact audit trails of access can be used to identify those who should not have seen a record, but this is a very weak kind of data protection; in fact, today patients identify the majority of data breaches, not health IT systems.

When will the US get serious about building privacy-enhancing architectures where ONLY clinical staff and others directly involved in a patient’s care can access the patient’s data, and only with informed consent? Systems that prevent access by MOST employees could prevent the vast majority of data breaches and data thefts.
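The core of such a privacy-enhancing architecture can be sketched in a few lines: access is denied by default and granted only when an employee is both on the patient's care team and named in the patient's consent. The class, field, and ID names below are hypothetical illustrations, not any vendor's actual design.

```python
from dataclasses import dataclass, field

@dataclass
class PatientRecord:
    patient_id: str
    care_team: set = field(default_factory=set)   # employee IDs directly involved in care
    consented: set = field(default_factory=set)   # employee IDs the patient has consented to

def can_access(record: PatientRecord, employee_id: str) -> bool:
    """Deny by default: require BOTH care-team membership and patient consent."""
    return employee_id in record.care_team and employee_id in record.consented

record = PatientRecord("p-001", care_team={"dr-smith"}, consented={"dr-smith"})
print(can_access(record, "dr-smith"))   # True: treating physician with consent
print(can_access(record, "dr-jones"))   # False: not on the care team, no consent
```

Contrast this with a pop-up that merely asks the employee to self-certify: here the check is enforced by the system before any data is returned, rather than audited after the fact.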

Building and using systems designed for privacy would be a FAR better use of the stimulus billions than how they are currently being spent: buying and promoting HIT systems that cannot possibly protect health data from misuse and theft, and that are in fact designed to spread health information to many unseen and unknown secondary corporate and government users.