Is Your Personal Health Data For Sale?

By Pat Anson, PNN Editor

Many U.S. consumers believe their personal health information is protected under the Health Insurance Portability and Accountability Act (HIPAA), a federal law that prohibits healthcare providers and insurers from sharing a patient’s sensitive health information without their consent or knowledge.

A new study on consumer data brokers and a federal complaint against a popular drug discount service show otherwise, with patient names, Social Security numbers, email addresses, prescription drug use and other personal information routinely being sold to third parties.

The Duke University study on data brokers focused only on mental health records, but gives you a good idea of what’s available on the open market. When researcher Joanne Kim contacted 37 data brokers asking to buy mental health data on millions of patients, 11 of them offered to sell her the requested data, which included information about whether an individual was being treated for depression, anxiety or insomnia, and if they were prescribed drugs such as Prozac or Zoloft.

The asking price for the information was relatively low, with one broker offering data on 10,000 aggregated patient records for $2,000 – or 20 cents per record. Bulk orders were even cheaper: 435,780 records were available for 6 cents each.

Many of the brokers did not give Kim a full explanation of their data or where it came from, making it difficult to determine whether they were offering “deidentified” information. Some firms openly advertised data that included individual names, addresses, phone numbers and emails. One broker even offered to sell her the IP addresses and browser history of patients.

“This research highlights a largely unregulated data brokerage ecosystem that sells sensitive mental health data in large quantities, with either vague or entirely nonexistent privacy protections,” Kim wrote in her report. “Data brokers are collecting, aggregating, analyzing, circulating, and selling sensitive mental health data on individuals. This comes as a great concern, especially since the firms seem either unaware of or loosely concerned about providing comprehensive privacy protections.”

Due to the stigma associated with mental health problems, Kim says the easy availability of personal health data puts millions of patients at risk of discrimination from employers and insurers, or even theft from scammers who prey on vulnerable populations.

“The nation is in dire need of a comprehensive federal privacy law, and this report recommends that the federal government should also consider generally banning the sale of mental health data on the open market,” she wrote. “Such a law should include provisions that allow consumers to opt out of the collection of their data, gain access to their information, and correct any discrepancies. Furthermore, data brokers should be obligated to be more transparent about their use and exchange of data, as well as have more controls in place for client management.”

One potential “client” that Kim doesn’t mention is law enforcement. In 2020, the Drug Enforcement Administration asked data brokers to submit bids on a potential contract for a surveillance program that would track at least 85% of U.S. prescriptions for opioids and other controlled substances. The DEA was seeking “unlimited access” to this prescription data, including the names of prescribers and pharmacists, types of medication, quantity, dose, refills and forms of payment.

While the contract was never awarded, it remains unclear what the DEA planned to do with the information or if it has found other ways to collect the data.

GoodRx Settlement

Where and how is personal health data collected? It could be as simple as a consumer trying to save money on medications.

The Federal Trade Commission recently reached a $1.5 million settlement with prescription drug discount provider GoodRx for failing to notify consumers that it was selling their information to Facebook, Google and other third parties for advertising purposes.

GoodRx offers considerable savings to patients who enroll in its free drug discount program, and makes money by selling their health and contact information to third parties. For example, according to the FTC complaint, GoodRx shared patient health data with Facebook, which then targeted them with advertisements for specific drugs to treat their health conditions.

“GoodRx’s sharing of personal and health information has revealed highly sensitive and private details about its users, most of whom suffer from chronic health conditions. This has led to the unauthorized disclosure of facts about individuals’ chronic physical or mental health conditions, medical treatments and treatment choices, life expectancy, disability status, parental status, substance addiction, sexual and reproductive health, and sexual orientation, as well as other information,” the FTC said.

“Disclosure of this information without authorization is likely to cause GoodRx users stigma, embarrassment, or emotional distress, and may also affect their ability to obtain or retain employment, housing, health insurance, disability insurance, or other services.”

In a press release, GoodRx said the FTC was focusing on an “old issue” that it addressed and corrected three years ago. “Millions of Americans use GoodRx to save on their healthcare, and we take strong measures to ensure they can trust us with their information,” the company said.

Data mining isn’t limited to healthcare providers, advertisers, internet companies or law enforcement. Medical researchers also use it to track patient conditions and evaluate the effectiveness of treatments. Some would also like to use the data to predict patient outcomes.

In a new study, researchers at the University of Alberta said they had devised an artificial intelligence model, trained on patient health data, that can predict with 90% accuracy whether a patient is at risk of an adverse outcome from opioid prescriptions. The researchers say their model could someday be used to warn doctors about high-risk patients, so they can prescribe a different drug or a smaller dose.
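To give readers a concrete sense of what this kind of risk prediction involves, here is a minimal, hypothetical sketch in Python. It is not the Alberta team’s actual model: the synthetic data, the feature names (daily dose, refill count, days supplied) and the logistic-regression approach are all illustrative assumptions.

```python
# Hypothetical sketch only -- not the University of Alberta model.
# Shows how a classifier might flag high-risk opioid prescriptions
# from simple prescription features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: [daily_dose_mme, refill_count, days_supplied]
X = rng.normal(loc=[50, 2, 30], scale=[20, 1, 10], size=(1000, 3))

# Synthetic labels: a toy rule where higher dose and more refills
# mean higher risk (days_supplied is left in as a noise feature).
y = (X[:, 0] + 10 * X[:, 1] + rng.normal(0, 15, 1000) > 90).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print("toy accuracy:", accuracy_score(y_test, model.predict(X_test)))

# In the kind of deployment the researchers describe, a prediction
# above some threshold would trigger a warning to the prescriber.
```

In practice, a real system would be trained on historical prescribing records with known outcomes, which is precisely why this line of research depends on access to the same patient data whose sale and sharing the rest of this article describes.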