2012 Sets New Record for Reported Data Breaches

Please view the full report at 2012 Sets New Record for Reported Data Breaches

Everyone knows that securing data is hard, but in healthcare much data is still not even encrypted. 2012 set a new record for reported data breaches.

  • -“With 2,644 incidents recorded through mid-January 2013, 2012 more than doubled the previous highest year on record (2011)”

“The latest information and research conducted by Risk Based Security suggests that organizations in all industries should be on notice that they face a very real threat from security breaches. Whether it is the constantly increasing security threats, ever-evolving IT technologies or limited security resources, data breaches and the costs related to response and mitigation are escalating quickly. Organizations today need timely and accurate analytics in order to better prioritize security spending based on their unique risks.”

Some key statistics:

“The Business sector accounted for 60.6 percent of all 2012 reported incidents, followed by Government (17.9%), Education (12.0%), and Medical (9.5%). The Business sector accounted for 84.7 percent of the number of records exposed, followed by Government (12.6%), Education (1.6%), and Medical (1.1%).”

“76.8% of reported incidents were the result of external agents or activity outside the organization with hacking accounting for 68.2% of incidents and 22.8% of exposed records in 2012. Incidents involving U.S. entities accounted for 40.7% of the incidents reported and 25.0% of the records exposed.”

Rekindling the patient ID debate

Unique patient identifiers have enormous implications for patient control and privacy. Dr. Deborah Peel is quoted in this article explaining how detrimental UPIs would be for patient trust and safety. To view the full article, please visit Rekindling the patient ID debate.

Key Quotations:

“The idea of unique patient identifiers (UPIs) is not a concept extracted from the next dystopian novel. It could very well be reality in the not-so-distant future. The question remaining, however, is whether or not the benefits of such technology outweigh constitutional privacy and patient trust concerns.”

“Deborah Peel, MD, founder of Patient Privacy Rights, and a fierce opponent of UPIs, writes in a Jan. 23 Wall Street Journal article, ‘In the end, cutting out the patient will mean the erosion of patient trust. And the less we trust the system, the more patients will put health and life at risk to protect their privacy.’

Peel points to the present reality of patient health information – genetic tests, claims data and prescription records – already being sold and commercialized. ‘Universal healthcare IDs would only exacerbate such practices,’ she avers.”

Nearly Half of U.S. Adults Believe They Have Little To No Control Over Personal Info Companies Gather From Them While Online

To view the full article, please visit Nearly Half of U.S. Adults Believe They Have Little To No Control Over Personal Info Companies Gather From Them While Online.

No surprise: 80% of US adults do NOT want targeted ads, and 24% think they have little to no control over the information they intentionally share online.

How will US adults feel when they learn they have no control over sensitive electronic health information? Despite the new Omnibus Privacy Rule, there is still no way we can stop our electronic health records from being disclosed or sold. The only actions we can take are avoiding treatment altogether, or seeking out physicians who use paper records and paying for treatment ourselves. No one should be faced with such bad choices. There is no reason we should have to give up privacy to benefit from technology.

Today, the only way to prevent OUR health information from being disclosed or sold to hidden third parties is to avoid electronic health systems as much as possible. That puts us in a terrible situation, because technology could have been used to ensure our control over our health data. The stimulus billions can still be used to build trustworthy technology systems that ensure we control personal health information. Institutions, corporations, and government agencies should not control our records and should have to ask for our consent before using them.

Quotes:

  • -“45% of U.S. adults feel that they have little (33%) or no (12%) control over the personal information companies gather while they are browsing the web or using online services such as photo sharing, travel, or gaming.”
  • -“many adults (24%) believe that they have little (19%) to no (5%) control over information that they intentionally share online”
  • -“one-in-five (20%) said that they only minimally understand (17%), or are totally confused (3%) when it comes to personal online protection”
  • -“When asked under what circumstances companies should be able to track individuals browsing the web or using online services, 60% say this should be allowed only after an individual specifically gives the company permission to do so.”
  • -“Just 20% of adults say that they want to receive personalized advertising based on their web browsing or online service use, while the large majority (80%) report that they did not wish to receive such ads.”

DNA records pose new privacy risks

To view the full article, please visit: DNA Records Pose New Privacy Risks

An article in the Boston Globe highlights the ease with which DNA records can be re-identified. According to the article, “Scientists at the Whitehead Institute for Biomedical Research showed how easily this sensitive health information could be revealed and possibly fall into the wrong hands. Identifying the supposedly anonymous research participants did not require fancy tools or expensive equipment: It took a single researcher with an Internet connection about three to seven hours per person.” Even truly anonymous data was not entirely safe from being re-identified. Yaniv Erlich “…decided to extend the technique to see if it would work with truly anonymous data. He began with 10 unidentified men whose DNA sequences had been analyzed and posted online as part of the federally funded 1,000 Genomes Project. The men were also part of a separate scientific study in which their family members had provided genetic samples. The samples and the donors’ relationships to one another were listed on a website and publicly available from a tissue repository.”

These findings are incredibly relevant because “something a single researcher did in three to seven hours could easily be automated and used by companies or insurers to make predictions about a person’s risk for disease,” even though the federal Genetic Information Nondiscrimination Act protects DNA from being used by health insurers and employers to discriminate against people.

Can computers predict medical problems? VA thinks maybe.

To view the full article written by Bob Brewin for Nextgov, please visit Can computers predict medical problems? VA thinks maybe.

“The Veterans Health Administration plans to test how advanced clinical reasoning and prediction systems can use massive amounts of archived patient data to help improve care, efficiency and health outcomes.”

Two veterans commented on the story below:

  • -“total invasion of privacy, I have a big problem with a “vendor” going through my records let alone the VA. the VA doesnt exactly have a good track record of protecting information”
  • -“veterans are NO LONGER guinea pigs without express PRIOR written consent, that is MEDICAL DATA covered by HIPAA, and is expressly forbidden to be managed in an open fashion and is NOT for sale.”

Like 99% of Americans, these vets oppose research use of their health information without consent.

US health IT systems and the VA could offer electronic consent to participate in studies:

  • -Electronic consent tools can enable each patient to set their own broad rules to allow research use of their health data.
  • -Vets could be ‘pinged’ for consent for EACH study, set broad rules to allow use of data for all studies, or set their rules for something in between (such as: I will agree to all research use of my data on traumatic brain injury and PTSD, but contact me for consent for all other studies); see the sketch after this list.
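
As a minimal sketch of how such consent rules could be modeled, assuming a simple topic-based policy (the names ConsentPolicy and Decision and the topic labels are invented for this example, not taken from any real VA or health-IT system):

```python
# Hypothetical sketch of per-patient research-consent rules.
# ConsentPolicy, Decision, and the topic names are illustrative only.
from dataclasses import dataclass, field
from enum import Enum

class Decision(Enum):
    ALLOW = "allow"  # data may be used without contacting the patient
    ASK = "ask"      # 'ping' the patient for this specific study
    DENY = "deny"    # never use the data

@dataclass
class ConsentPolicy:
    default: Decision = Decision.ASK                 # ask for EACH study unless a rule matches
    topic_rules: dict = field(default_factory=dict)  # topic -> Decision

    def evaluate(self, study_topics: set) -> Decision:
        """Return the patient's decision for a proposed study."""
        decisions = {self.topic_rules.get(t, self.default) for t in study_topics}
        if Decision.DENY in decisions:
            return Decision.DENY   # any denied topic blocks the whole study
        if Decision.ASK in decisions:
            return Decision.ASK    # any unresolved topic requires a ping
        return Decision.ALLOW

# The 'in between' rule from the bullets above: allow TBI and PTSD research
# automatically, ask about everything else.
vet = ConsentPolicy(topic_rules={"traumatic_brain_injury": Decision.ALLOW,
                                 "ptsd": Decision.ALLOW})
print(vet.evaluate({"ptsd"}))                    # Decision.ALLOW
print(vet.evaluate({"ptsd", "medication_use"}))  # Decision.ASK
```

The point of the sketch is that the default is ASK, so no study touches a patient's data silently; broad permissions are something each patient opts into, topic by topic.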

Unfortunately the new Omnibus Privacy Rule grants open access to all 300 million citizens’ sensitive health information without consent for any ‘research’ or ‘public health’ use.
The broad ‘research loophole’ in HIPAA and the new Omnibus Privacy Rule permits industry (corporations including insurers, employers, drug companies, marketers, pharmacies, labs, and others) to use and sell our personal data for “research” that we would never agree to. ‘Research’ is defined so broadly that:

  • -Blue Health Intelligence (a subsidiary of Blue Cross Blue Shield) does ‘research’. It uses and sells enrollees’ health data without consent.
  • -IMS Health data mines and sells the nation’s prescription records. Claiming to do ‘research’ allows IMS Health to use and sell Americans’ prescription records without consent.
  • -Many electronic health record companies (Cerner, GE Centricity, Greenway, Athena Health, and Practice Fusion) are also ‘research companies’ and sell health data.
  • -The ‘research’ industry sells data that is supposedly ‘de-identified’, but health data is easy to re-identify (see the paper by Narayanan and Shmatikov, http://www.cs.utexas.edu/~shmat/shmat_cacm10.pdf, and the sketch after this list). And there is no way to know when ‘de-identified’ data is re-identified. Texas law bans re-identification of health data, but the system depends on whistleblowers to report violations.
  • -Most ‘researchers’ are not physicians, scholars, and PhDs at academic centers, as the public assumes.
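
To make the re-identification point concrete, here is a minimal sketch of the kind of linkage attack Narayanan and Shmatikov describe: joining a ‘de-identified’ dataset to a public one on shared quasi-identifiers. All records below are invented for illustration.

```python
# Minimal linkage-attack sketch: 're-identify' records by joining a
# de-identified dataset with a public roster on quasi-identifiers.
# All data below is invented for illustration.
deidentified_claims = [   # names stripped, diagnoses kept
    {"zip": "78701", "birthdate": "1960-04-02", "sex": "F", "diagnosis": "depression"},
    {"zip": "78705", "birthdate": "1975-11-19", "sex": "M", "diagnosis": "hypertension"},
]
public_roster = [         # e.g., a voter roll or social-network profile dump
    {"name": "Jane Doe", "zip": "78701", "birthdate": "1960-04-02", "sex": "F"},
    {"name": "John Roe", "zip": "78705", "birthdate": "1975-11-19", "sex": "M"},
]

QUASI_IDS = ("zip", "birthdate", "sex")

def reidentify(claims, roster):
    """Match 'anonymous' claims to names via shared quasi-identifiers."""
    index = {tuple(p[k] for k in QUASI_IDS): p["name"] for p in roster}
    for record in claims:
        key = tuple(record[k] for k in QUASI_IDS)
        if key in index:  # a unique combination of quasi-identifiers leaks identity
            yield index[key], record["diagnosis"]

for name, diagnosis in reidentify(deidentified_claims, public_roster):
    print(name, "->", diagnosis)
```

Nothing here requires fancy tools: a few lines of code and one public dataset are enough once the quasi-identifier combinations are unique, which is exactly why ‘de-identification’ alone is such weak protection.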

Why wouldn’t every corporation that touches health data declare itself a ‘research institution’ so it can collect, use, and sell Americans’ health data? Personal health information is THE MOST valuable data of all, but we have no way to control which corporations collect and use health data.
How large a part of the surveillance economy is personal health data?

Clouds in healthcare should be viewed as ominous: Quotes from Dr. Deborah Peel

A recent article in FierceEMR written by Marla Durben Hirsch quotes Dr. Peel about the dangers of cloud technology being used in healthcare. Dr. Peel tells FierceEMR that “There’s a lot of ignorance regarding safety and privacy of these [cloud] technologies”.

Here are a few key quotes from the story:

“It’s surely no safe haven for patient information; to the contrary it is especially vulnerable to security breaches. A lot of EHR vendors that offer cloud-based EHR systems don’t take measures to keep patient data safe. Many of them don’t think they have to comply with HIPAA’s privacy and security rules, and many of their provider clients aren’t requiring their vendors to do so.” (Hirsch)

“Many providers have no idea where the vendor is hosting the providers’ patient data. It could be housed in a different state; or even outside of the country, leaving it even more vulnerable. ‘If the cloud vendor won’t tell you where the information is, walk out the door,’ Peel says.”

“Then there’s the problem of what happens to your data when your contract with the cloud vendor ends. Providers don’t pay attention to that when they sign their EHR contract, Peel warns.”

“‘The cloud can be a good place for health information if you have iron clad privacy and security protections,’ Peel says. ‘[But] people shouldn’t have to worry about their data wherever it’s held.’”

OCR Could Include Cloud Provision in Forthcoming Omnibus HIPAA Rule

The quotes below are from an article written by Alex Ruoff in the Bloomberg Health IT Law and Industry Report.

“Deborah Peel, founder of Patient Privacy Rights, said few providers understand how HIPAA rules apply to cloud computing. This is a growing concern among consumer groups, she said, as small health practices are turning to cloud computing to manage their electronic health information. Cloud computing solutions are seen as ideal for small health practices as they do not require additional staff to manage information systems, Peel said.
Cloud computing for health care requires the storage of protected health information in the cloud—a shared electronic environment—typically managed outside the health care organization accessing or generating the data.
Little is known about the security of data managed by cloud service providers, Nicolas Terry, co-director of the Hall Center for Law and Health at Indiana University, said. Many privacy advocates are concerned that cloud storage, because it often stores information on the internet, is not properly secured, Terry said. He pointed to the April 17 agreement between Phoenix Cardiac Surgery and HHS in which the surgery practice agreed to pay $100,000 to settle allegations it violated HIPAA Security Rules.
Phoenix was using a cloud-based application to maintain protected health information that was available on the internet and had no privacy and security controls.

Demands for Guidance

Peel’s group, in the Dec. 19 letter, called for guidance “that highlights the lessons learned from the Phoenix Cardiac Surgery case while making clear that HIPAA does not prevent providers from moving to the cloud.”

Peel’s letter asked for:
• technical safeguards for cloud computing solutions, such as risk assessments of and auditing controls for cloud-based health information technologies;
• security standards that establish the use and disclosure of individually identifiable information stored on clouds; and
• requirements for cloud solution providers and covered entities to enter into a business associate agreement outlining the terms of use for health information managed by the cloud provider.”

Patient privacy group (PPR) asks HHS for HIPAA cloud guidance

Government HealthIT recently wrote an article about the letter Dr. Peel of Patient Privacy Rights sent to the HHS Office for Civil Rights pushing for security guidelines, standards, and enforcement for cloud technology used in healthcare.

Here are a few key points highlighted in the article:

“Issuing guidance to strengthen and clarify cloud-based protections for data security and privacy will help assure patients (that) sensitive health data they share with their physicians and other health care professionals will be protected,” Peel said.

“Cloud-computing is proving to be valuable, Peel said, but the nation’s transition to electronic health records will be slowed ‘if patients do not have assurances that their personal medical information will always have comprehensive and meaningful security and privacy protections.’”

“Patient Privacy Rights, a group founded in 2006, is encouraging HHS to adopt guidelines that highlight ‘the lessons learned from the Phoenix Cardiac Surgery case while making it clear that HIPAA does not prevent providers from moving to the cloud as long as it is done responsibly and in compliance with the law.’”

“In general, Peel said, cloud providers and the healthcare industry at large could benefit from guidance and education on the application of federal privacy and security rules in the cloud. ‘HHS and HIPAA guidance in this area, to date, is limited,’ Peel said, recommending the National Institute of Standards and Technology’s cloud privacy guidelines as a baseline.”

Health-care sector vulnerable to hackers, researchers say

From the Washington Post article by Robert O’Harrow Jr. titled Health-care sector vulnerable to hackers, researchers say

“As the health-care industry rushed onto the Internet in search of efficiencies and improved care in recent years, it has exposed a wide array of vulnerable hospital computers and medical devices to hacking, according to documents and interviews.

Security researchers warn that intruders could exploit known gaps to steal patients’ records for use in identity theft schemes and even launch disruptive attacks that could shut down critical hospital systems.

A year-long examination of cybersecurity by The Washington Post has found that health care is among the most vulnerable industries in the country, in part because it lags behind in addressing known problems.

“I have never seen an industry with more gaping security holes,” said Avi Rubin, a computer scientist and technical director of the Information Security Institute at Johns Hopkins University. “If our financial industry regarded security the way the health-care sector does, I would stuff my cash in a mattress under my bed.””

Kravis Backs N.Y. Startups Using Apps to Cut Health Costs

The title should have been “Wall Street trumps the Hippocratic Oath and NY patients’ privacy” or “NY gives technology start-ups free access to millions of New Yorkers’ sensitive health data without informed consent starting in February”.

Of course we need apps to lower health costs, coordinate care, and help people get well, but apps should be developed using ‘synthetic’ data, not real patient data. Giving away valuable identifiable patient data to app developers is very risky and violates patients’ legal and ethical rights to health information privacy under state and federal law—each of us has strong rights to decide who can see and use personal health information.

What happens when app developers use, disclose, or sell Mayor Bloomberg’s, Governor Cuomo’s, Secretary of State Hillary Clinton’s, or Peter Thiel’s electronic health records? Or will access to prominent people’s health records be blocked by the data exchange, while everyone else’s future jobs and credit are put at risk by developer access to health data? Will Bloomberg publish a story about the consequences of this decision by whoever runs the NY health data exchange? Will Bloomberg write about the value, sale, and massive technology-enabled exploitation of health data for discrimination and targeted marketing of drugs and treatments, or for extortion of political or business enemies? Natasha Singer of the New York Times calls this the ‘surveillance economy’.

The story did not mention ways to develop apps that protect patients’ sensitive information from disclosure to people not directly involved in patient care. It could have noted that the military uses “synthetic” patient data for technology research and app development, recognizing that failing to protect the security and privacy of sensitive data about members of the military and their families creates major national security risks. The military builds and tests technology and apps on synthetic data; researchers and app developers don’t get access to real, live patient data without tough security clearances and high-level review, and only for approved projects that benefit patients. Open access to military health databases threatens national security. Will open access to New Yorkers’ health data also threaten national security?
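
As a toy sketch of the synthetic-data approach, here is a stdlib-only generator; the field names and value pools are invented for illustration, and no real patient is behind any record:

```python
# Toy generator of 'synthetic' patient records: realistic structure,
# no real patient behind any record. Fields and value pools are invented.
import random
import uuid

DIAGNOSES = ["hypertension", "type 2 diabetes", "asthma", "depression"]

def synthetic_patient(rng):
    """Generate one fake patient record from fixed value pools."""
    return {
        "patient_id": str(uuid.UUID(int=rng.getrandbits(128))),  # random, derived from no one
        "age": rng.randint(18, 90),
        "sex": rng.choice(["F", "M"]),
        "diagnoses": rng.sample(DIAGNOSES, k=rng.randint(1, 2)),
        "systolic_bp": round(rng.gauss(125, 15), 1),
    }

rng = random.Random(42)  # seeded so test fixtures are reproducible
cohort = [synthetic_patient(rng) for _ in range(1000)]
print(cohort[0])
```

A developer can build and test an app against a cohort like this with zero privacy risk, because there is nothing to breach; real data would only ever be needed, if at all, under the kind of clearance and review the military model requires.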

NY just started a national and international gold rush to develop blockbuster health apps, AND it will set off a rush by other states to give away or sell identifiable patient health information in health information exchanges (HIEs) or health information organizations (HIOs), by allowing technology developers access to an incredibly large, valuable database of identifiable patient health information. Do the developers get the data free, or is NY selling health data? The bipartisan Coalition for Patient Privacy (representing 10.3 million people) worked to get a ban on the sale of patient health data into the stimulus bill, because the hidden sale of health data is a major industry that enables hidden discrimination in key life opportunities like jobs and credit. Selling patient data for all sorts of uses is a very lucrative industry.

Further, NY patients are being grossly misled: they think they gave consent ONLY for their health data to be exchanged so other health professionals can treat them. Were they informed that, starting in February, dozens of app developers will be able to copy all their personal health data to build technology products they may not want or have any interest in?

The worst consequence of systems that eliminate privacy is that patients act in ways that risk their health and lives when they know their health information is not private:

  • -600,000 people per year avoid early treatment and diagnosis for cancer because they know their records will not be private
  • -2 million per year avoid early treatment and diagnosis for depression for the same reason
  • -millions per year avoid early treatment and diagnosis of STDs, for the same reason
  • -1 in 8 patients hide data, omit information, or lie to try to keep sensitive information private

More questions:

  • -What proof is there that the app developers comply with the contracts they sign?
  • -Are they audited to prove the identifiable patient data is truly secure and not sold or disclosed to third parties?
  • -What happens when an app developer suffers a privacy breach? Most health data today is not secure or encrypted. If the app developers signed Business Associate Agreements, at least they would have to report data breaches.
  • -What happens when many of the app developers can’t sell their products or the businesses go bust? They will sell the patient data they used to develop the apps for cash.
  • -The developers reportedly signed data use agreements “covering federal privacy rules”, which probably means they are required to comply with HIPAA. But HIPAA allows data holders to disclose and sell patient data to third parties, promoting further hidden uses of personal data that patients will never know about, much less be able to agree to. Using contracts that do not require external auditing to protect sensitive information, and not requiring proof that the developers can be trusted, is bad business practice.

NY has opened Pandora’s box and not even involved the public in an informed debate.