Leader of Hospital Identity Theft Ring Sentenced

It’s impossible to stop the tsunami of fraud, ID theft, and medical ID theft until we rebuild US health IT systems to prevent open access to millions of patient records by thousands of hospital and insurance company employees.
Systems should be rebuilt to allow ONLY those few people who are directly involved with a patient’s treatment to access that patient’s health records.

  • ONLY those who carry out the orders of the patient’s physician should be able to access that patient’s electronic health records.
  • The other hundreds or thousands of hospital system employees and staff members should not be physically or technically able to access that patient’s records.
  • When a patient is admitted, one physician is in charge of diagnosis and treatment.
  • All the people the attending physician orders to treat the patient (nurses, consultants, respiratory therapists, etc.) work for that physician, the “captain of the ship.”

Health data cannot possibly be protected when thousands of people have access to millions of patient records. Employees of the hundreds of separate health technology companies whose products every hospital uses also have open access to millions of patient records.
The more people who have access to sensitive personal health data, the easier it is to steal, sell, or misuse it.
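Below is a minimal sketch, in Python, of the “captain of the ship” access rule described above. The names and data structures are hypothetical illustrations, not any hospital system’s actual design: a staff member can open a chart only if the patient’s attending physician has placed them on that patient’s care team.

```python
# Hypothetical sketch of relationship-based access control: only the attending
# physician and the staff the attending has ordered to treat the patient may
# access that patient's record.
from dataclasses import dataclass, field


@dataclass
class CareTeam:
    """Care team for one admission, controlled by the attending physician."""
    attending_physician: str
    members: set = field(default_factory=set)  # staff ordered to treat the patient

    def add_member(self, ordering_physician: str, staff_id: str) -> None:
        # Only the attending physician may add treatment staff to the team.
        if ordering_physician != self.attending_physician:
            raise PermissionError("Only the attending physician may order treatment staff.")
        self.members.add(staff_id)


def may_access_record(staff_id: str, care_team: CareTeam) -> bool:
    """Access is limited to the attending and the staff the attending ordered."""
    return staff_id == care_team.attending_physician or staff_id in care_team.members


# A nurse ordered by the attending can read the chart; a billing clerk
# elsewhere in the hospital system cannot.
team = CareTeam(attending_physician="dr_alvarez")
team.add_member("dr_alvarez", "nurse_kim")
assert may_access_record("nurse_kim", team)
assert not may_access_record("billing_clerk_417", team)
```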

May 15, 2013 Health Care Symposium – Dialogue on Diversity

PPR Founder Deborah C. Peel, MD Joins Experts at
Dialogue on Diversity’s Health Care Symposium 2013

The Elusive Concept: Health Care a $15 Tr. Economy Can “Afford”

On May 15, 2013, Dr. Deborah Peel will join other experts in Washington, DC for the Health Care Symposium 2013, “The Elusive Concept: Health Care a $15 Trillion Economy Can ‘Afford.’” During the complimentary lunch, the Honorable Donna M. Christian-Christensen will receive Dialogue on Diversity’s Health Leadership Award, followed by Dr. Deborah Peel’s panel.

Registration is free to the public, and a complimentary breakfast will also be provided. See the full agenda with specific times here.

The day begins with focused discussions on the laws of health care as well as rising costs, followed by a panel on food and nutrition and the need for preventative strategies. After the lunch panel, experts will discuss cultural competency and class and ethnic access disparities. The day will close with a discussion of the chief medical threats in the United States, such as cancer, AIDS, and obesity.

See more on sessions and speakers in this Press Advisory.

For the past two years, Dialogue on Diversity has worked with PPR as a member of the Coalition for Patient Privacy as well as a Consumer Partner of the Health Privacy Summit.

WHAT:

Health Care Symposium 2013
The Elusive Concept: Health Care a $15 Tr. Economy Can “Afford”
WHEN:
Wednesday, May 15th, 2013 | 8:30 a.m. – 3:30 p.m. ET
WHERE:

The American Federation of Teachers
555 New Jersey Avenue, N.W.
Washington, DC 20001

Patient Privacy Rights hires CTO

From the article and Q&A by Diana Manos in Healthcare IT News: Patient Privacy Rights hires CTO

“Patient Privacy Rights appointed Adrian Gropper, MD as its first chief technology officer. Gropper is an expert in the regulated medical device field, an experienced medical informatics executive, and he has a long record of contributing to the development of state and national health information standards, according to a PPR news release.

Gropper, who has worked with federal initiatives and the Markle Foundation to help create the Direct Project’s secure email system and Blue Button technologies, says he joins PPR because the challenges of runaway costs and deep inequities in the U.S. health system call for new information tools and inspired regulation.

“PPR’s deep respect for the medical profession and our total dedication to the patient perspective form the foundation for a series of policy and practice initiatives to shape health reform and electronic health systems,” Gropper said in the news release. “As a member of the PPR team, I look forward to driving a national consensus on the most difficult issues in the information age, including respectful patient identity, trustworthy consent, research acceleration, and effective public health.”

According to PPR, Gropper is a pioneer in privacy-preserving health information technology going as far back as the Guardian Angel Project at MIT in 1994. As CTO of one of the earliest personal health records companies, MedCommons, he actively participated in most of the PHR policy and standards initiatives of the past decade.”

See the full Q&A
See PPR’s Press Release

Re: Poor Prognosis for Privacy

In response to The Wall Street Journal article by Melinda Beck: Poor Prognosis for Privacy

Most healthcare institutions and John Halamka ignore the fact that for over a decade technology has empowered millions of patients to control which parts of their electronic health records are disclosed for mental health and addiction treatment. The technology for ‘segmentation’ exists.

Congress, the courts, state and federal laws, and medical ethics require that patients control who can see and use sensitive personal health data, yet the federal regulators who write the rules for industry have not required electronic health systems to use ‘segmentation’ or other technologies, such as metadata tagging, that could also enable selective disclosures of health information.

When the public finds out they can’t control the use or disclosure of sensitive personal health data, many millions will refuse early diagnosis and treatment for cancer, depression, and STDs every year—and millions more will hide information, refuse tests, and act in ways that put their health at risk. These are bad outcomes.

Should the public be forced to use health technology systems that cause bad outcomes? Why not require technology that IMPROVES health outcomes?

Employees’ unhealthy habits have growing effect on their insurance premiums

The story below concludes that “Employees now contribute 42 percent more for health care than they did five years ago.”   Just because employees are stuck paying higher healthcare bills doesn’t necessarily mean they are causing costs to increase.

If employees were driving up healthcare costs, then using financial penalties to force them to undergo intrusive health screenings and join wellness programs might make sense.

But employees aren’t causing the high costs of healthcare in the US. Time magazine concluded that the main culprits are healthcare corporations, such as hospitals and the pharmaceutical industry, along with outpatient procedures and lobbying costs.

Time magazine’s issue titled “Bitter Pill: Why Medical Bills Are Killing Us” identified several factors behind high US healthcare costs.

The article below quotes the National Business Group on Health (NBGH), a lobbying group with assets of $18,772,047 in 2011. The NBGH blames employees for rising healthcare costs, instead of its many healthcare corporation members.

  • URL for NBGH members: https://www.businessgrouphealth.org/join/members.cfm
  • Blaming employees allows the NBGH to defend using coercive, intrusive wellness programs even for employees with complex, hard-to-manage illnesses that wellness programs don’t help:
    • See “Wellness Incentives In The Workplace: Cost Savings Through Cost Shifting To Unhealthy Workers” by Jill R. Horwitz, Brenna D. Kelly, and John E. DiNardo. Health Affairs, 32, no. 3 (2013): 468-476; doi: 10.1377/hlthaff.2012.0683; http://content.healthaffairs.org/content/32/3/468.full.html

Meanwhile, screening companies, labs, and wellness programs collect sensitive employee health information and control its use, disclosure, and sale.

  • There is no ‘chain of custody’ for health data, so employees have no way to know who sees their health information.
  • The US has NO data map to track the thousands of hidden companies that collect, use, or sell Americans’ personal health information.
  • Corporations that collect employees’ health information treat it as a corporate asset, not as sensitive personal information that patients have strong rights to control.
  • So it’s impossible to verify whether the NBGH lobbyist’s statement that “few employers would risk intentionally misusing such information” is true or false.

Blaming people who are sick for the high costs of their medical care instead of the corporations that overcharge is a really neat trick. It also provides a rationale for coercing employees to enter wellness programs and violating their rights to health privacy.

Unfortunately, simply “blaming the victims” won’t solve escalating healthcare costs.  We have to look broadly at individuals, the entire healthcare system, the food-chain, and larger cultural factors to identify and deal with all the real causes.

athenahealth and Mashery team up for health developer-friendly API initiative

To view the full article, please visit athenahealth and Mashery team up for health developer-friendly API initiative.

Electronic health record (EHR) companies allow access to patients’ sensitive health data, and to sensitive information about physicians’ practices, so technology companies can develop applications.

Applications have the potential to be useful to physicians and patients, but at what cost to privacy? Will EHR “apps” secretly collect and sell people’s information the way smartphone apps collect and sell contacts, GPS data, and more? We now know the business model for many technologies is selling intimate personal data.

Quotes:

  • athenahealth will open “access to doctors’ appointment data, patient’s medical history (anonymized), billing information and more”
  • “the company hopes developers will be able to create an ecosystem of apps on top of athenahealth’s EMR service”
  • “Other EMR providers, including Allscripts and Greenway, have also opened up their APIs to developers and created app marketplaces.”

The press release on this athenahealth project stated, “We’re providing the data and knowledge from our cloud-based network, a captive audience for developers to innovate for, and an online sandbox to do it all in.”

  • Who are the “captives”? athenahealth’s 40,000 physicians and their hundreds of thousands of patients.

QUESTIONS:

  • When were the “captive” patients asked to consent to strangers using and monetizing their health records?
  • When were the “captive” physicians asked to consent to strangers using information about their practices: what they charge, who they treat, how they treat patients, how they are paid and by whom, and much more?
  • Why does athenahealth claim that patient data is “anonymized,” when it’s impossible to prevent “anonymized” patient records from being easily re-identified?

Many electronic health record (EHR) companies allow access to, or sell, sensitive patient data to technology developers and other companies.

BROADER QUESTIONS

  • When did the public learn about, debate, or agree to the use of their sensitive patient data by technology companies to build products?
  • Why do technology companies claim that “anonymization” and “de-identification” of health data work, when computer science has clearly proved them wrong? (See the sketch after this list.)
  • How is the identifiable health data of hundreds of thousands of patients protected from any OTHER uses the technology developers decide to put it to?
  • How can the public weigh the risks and harms vs. benefits of using EHRs when there is no ‘chain of custody’ for our health data and no data map that tracks the thousands of HIDDEN users of our personal health information?
  • See Harvard Professor Latanya Sweeney explain the need for a data map at: http://tiny.cc/5pjqvw
    • Attend or watch via live-streaming video the 2013 International Summit on the Future of Health Privacy in Washington, DC, June 5-6, to see the first data map Prof. Sweeney’s team has built. Registration to attend or watch is free at: www.healthprivacytsummit.org
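To make the re-identification concern concrete, here is a minimal sketch in Python with made-up toy data (nothing below comes from a real dataset). Records stripped of names can often be matched back to individuals by joining on “quasi-identifiers” such as ZIP code, birth date, and sex, the kind of linkage Professor Sweeney demonstrated with voter registration rolls.

```python
# Hypothetical toy example of re-identifying "anonymized" records by linking
# quasi-identifiers (ZIP code, birth date, sex) to a public, identified dataset.

# "Anonymized" medical records: names removed, quasi-identifiers left in place.
anonymized_records = [
    {"zip": "78701", "birth_date": "1961-03-14", "sex": "F", "diagnosis": "depression"},
    {"zip": "78745", "birth_date": "1978-09-02", "sex": "M", "diagnosis": "hypertension"},
]

# A public, identified dataset (for example, a voter registry) with the same fields.
public_registry = [
    {"name": "Jane Doe", "zip": "78701", "birth_date": "1961-03-14", "sex": "F"},
    {"name": "John Roe", "zip": "78745", "birth_date": "1978-09-02", "sex": "M"},
]


def reidentify(records, registry):
    """Join the two datasets on quasi-identifiers to recover identities."""
    matches = []
    for rec in records:
        for person in registry:
            if all(rec[key] == person[key] for key in ("zip", "birth_date", "sex")):
                matches.append({"name": person["name"], "diagnosis": rec["diagnosis"]})
    return matches


# Prints each person's name alongside the "anonymized" diagnosis that now
# points straight back to them.
print(reidentify(anonymized_records, public_registry))
```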

Mostashari, policy committee take critical look at CommonWell

To view the full article, please visit: Mostashari, policy committee take critical look at CommonWell

The ONLY way patients and the public will trust health technology systems is if THEY control ‘interoperability’, i.e., if THEY control their sensitive health data. Patients have strong rights to control exactly who can collect, use, and disclose their health data. This also happens to be what the public expects and wants MOST from HIT. The public has strong legal rights to control PHI, despite our flawed HIT systems.

The story below is about an attempt by large technology vendors and the government to maintain control over the nation’s sensitive health data. Institutional/government-sanctioned models like the CommonWell Alliance violate patients’ rights to control their medical records (from diagnoses to DNA to prescription records).  Patients should be able to:

  • choose personal email addresses as their IDs; there is no need for institutions to choose IDs for us, since email addresses already work very well as IDs on the Internet
  • download and store their health information from electronic health record systems (EHRs), required by HIPAA since 2001 but only now becoming reality via the Blue Button+ project
  • email their doctors using Direct secure email

Today’s systems violate 2,400 years of ethics underlying the doctor-patient relationship and the practice of medicine: Hippocrates’ discovery that patients would only be able to trust physicians with deeply personal information about their bodies and minds IF the doctors never shared that information without consent. That ‘ethic’, i.e., to guard the patient’s information and act as the patient’s agent and protector, is codified in the Hippocratic Oath and embodied in American law and the AMA Code of Medical Ethics. Americans have strong rights to health information privacy, which HIPAA has not wiped out (HIPAA is the FLOOR, not the CEILING, for our privacy rights).

The public does NOT agree that their sensitive health data should be used without consent; they expect to control health information, with rare legal exceptions. See: http://patientprivacyrights.or…. HUGE majorities believe that individuals alone should decide what data they want to share and with whom, not one-size-fits-all laws or policies.

Nor does the public agree to use of their personal health data for “research”—whether for clinical research about diseases or by industry for commercial use of the data via the ‘research and public health loopholes’ in HIPAA. Only 1% of the public agrees to unfettered use of personal health data for research. Read more about these survey results here.

The entire healthcare system depends TOTALLY on a two-person relationship and on whether there is trust between those two people. We must face the fact that today’s HIT systems VIOLATE that personal relationship by making it ‘public’ via health technology systems designed for data mining and surveillance. Instead we need technology designed to ensure patient control over personal health information (with rare legal exceptions). When patients cannot trust their doctors, health professionals, or the flawed technology systems they use, the consequence is that many millions of patients avoid or delay treatment and hide information. Every year many millions of Americans take actions which CAUSE BAD OUTCOMES.

Current health technologies and data exchange systems cause millions of people annually to risk their health and lives, i.e., the technologies we are using now cause BAD OUTCOMES.

We have to face facts and design systems that can be trusted. Patient Privacy Rights’ Trust Framework details, in 75 auditable criteria, what it takes to be a trusted technology or system. See: http://patientprivacyrights.or… or download the paper at: http://ssrn.com/abstract=22316…

Sensitive data still pose special challenges

At a recent meeting of the National Health IT Policy Committee, the CEO of a large electronic health records (EHR) corporation said technology for “data segmentation”—which ensures patients control who sees and uses sensitive data—is something “vendors don’t know how to do.” But that simply isn’t true. Vendors do know how to build that kind of technology; in fact, it already exists.

At the same meeting, the National Coordinator for Health IT recognized the Department of Veterans Affairs and the Substance Abuse and Mental Health Services Administration for their “demonstration of technology developed for data segmentation and tagging for patient consent management,” but he seemed to forget that millions of people receiving mental health and addiction treatment have been using EHRs with consent and data segmentation technologies for over 12 years. Again, the technology already exists.

Facts:

  • Technology is NOT the problem—it’s not too hard or too expensive to build or use consent and data segmentation technologies.
  • Data segmentation and consent technologies exist (a minimal sketch follows this list): the oldest example is EHRs used for millions of mental health and addiction treatment records for the past 12 years.
  • All EHRs must be able to “segment” erroneous data to keep it from being disclosed and harming patients—that same technology can be used to “segment” sensitive health data.
  • Data segmentation and consent technologies were demonstrated ‘live’ at the Consumer Choices Technology Hearing in 2010. See a video: http://nmr.rampard.com/hit/20100629/default.html
  • Starting in 2001, HIPAA required data segmentation and consent technology so that EHRs keep “psychotherapy notes” separated from other health data. “Psychotherapy notes” can ONLY be disclosed with patient permission.
  • The 2013 amendments to HIPAA require EHRs to handle other situations where data must be segmented and consent is required. For example:
    • If you pay out-of-pocket for treatment or for a prescription in order to keep your sensitive information private, technology systems must prevent your data from being disclosed to other parties.
    • After the first time you are contacted by hospital fundraisers who saw your health data, you can opt out and block the fundraisers from future access to your EHR.
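To show how simple the core idea is, here is a minimal sketch of consent-based data segmentation in Python. The category names and consent structure are hypothetical illustrations, not any certified EHR’s actual design: each record carries a sensitivity tag, and the disclosure routine releases only the categories the patient has consented to share with a given recipient.

```python
# Hypothetical sketch of data segmentation: records are tagged by sensitivity
# category, and a disclosure releases only what the patient has consented to.
from dataclasses import dataclass


@dataclass
class Record:
    description: str
    category: str  # e.g. "general", "mental_health", "psychotherapy_notes"


# Per-patient consent directives: which categories each recipient may receive.
consent_directives = {
    "primary_care_clinic": {"general", "mental_health"},
    "billing_service": {"general"},
}

chart = [
    Record("annual physical", "general"),
    Record("therapy session note", "psychotherapy_notes"),
    Record("antidepressant prescription", "mental_health"),
]


def disclose(chart, recipient):
    """Release only the record categories this recipient is consented to receive."""
    allowed = consent_directives.get(recipient, set())
    return [record for record in chart if record.category in allowed]


# The billing service receives only the general record; the psychotherapy note
# is withheld from every recipient unless explicit permission is added.
for record in disclose(chart, "billing_service"):
    print(record.description)
```

The same tagging mechanism can flag erroneous entries so they are never disclosed, which is the point made in the third bullet above.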

The real problem is that current technology systems and data exchanges are not built to work the way the public expects them to—they violate Americans’ ethical and legal rights to health information privacy.

The public will discover that today’s health technologies and systems have fatal privacy flaws. The unintended consequence of using flawed technology is that millions of people will avoid or delay treatment and hide information to keep their health information private, and will suffer bad health outcomes as a result.

US health technology should improve health and outcomes, not cause the health of millions to worsen.

How can the US fix the privacy flaws in health technology systems so EHRs and other health technologies can be trusted?

An American Quilt of Privacy Laws, Incomplete

The MOST “incomplete” US privacy law is HIPAA, which in 2001 eliminated Americans’ rights to control the collection, use, disclosure, and sale of their health data.

The new Omnibus Privacy Rule did not fix this disaster. It made things worse by explicitly permitting health data sales for virtually any purpose without patients’ consent or knowledge. These new regulations violate Congress’ intent to ban the sale of health data in the 2009 stimulus bill.

In addition to not being able to control their personal health information, Americans have no ‘chain of custody’ for their health data, so there is no way to know who is using or selling it.

We need a data map to track all the hidden users and sellers of our personal health information, from our DNA, to our diagnoses, to our prescription records:

  • Watch Professor Sweeney describe the Harvard Data Privacy Lab/Patient Privacy Rights research project to track hidden users of our health data at: http://patientprivacyrights.org/thedatamap/
  • WE NEED A DATA MAP TO SHOW THE GOVERNMENT IT’S TIME TO FIX THIS PRIVACY DISASTER!

Attend or watch the next health privacy summit, June 5-6 in Washington, DC, to learn about these urgent health data problems and potential solutions.

A new CVS wellness program raises privacy concerns

From the Thomson Reuters News & Insight article by Anna Louie Sussman, “A new CVS wellness program raises privacy concerns”

(Reuters) – When nationwide pharmacy chain CVS Caremark Corp announced last week that its employees must submit to a medical exam or pay a $600 annual fine, some critics raised privacy concerns…

Under the CVS exam, which is free, tests will measure an employee’s weight, body fat, blood pressure, glucose levels and other health indicators. Workers who smoke must enroll in an addiction program by next year.

“They draw blood, that’s data collection. You have to go through a screening, that’s data collection. You have to call WebMD’s center, that’s data collection. People’s sensitive health data is being used for commercial purposes,” said Dr. Deborah Peel, founder of the advocacy organization Patient Privacy Rights.