
Sensitive Data: Sold


The government buying data revealing “where you’ve been…who you’re connected to….[and] the nature of your beliefs and predictions about what you might do in the future.” Mental health data, including what you’re diagnosed with, what you’re likely to be diagnosed with, and what medications you take, freely available for purchase by public agencies. It all sounds like something out of a dystopian fantasy. Unfortunately, this isn’t sci-fi, as an article from Anne Toomey McKenna and research done by Joanne Kim help to explain; this is our reality, and the laws are far, far behind.

What are they selling, and who’s buying?

McKenna states that commercially available data (personal information collected from an array of sources by data brokers, who aggregate it and sell it to others) can include data that is private, confidential, or otherwise legally protected. Types of data include gender, sexual orientation, location, religious views, weight, blood pressure, behavioral information, and family and friends. Kim, whose research focuses on the sale of Americans’ mental health data, describes even more of what is available. The data brokers she researched were willing to sell, among many other things, diagnoses of mental health conditions, the likelihood of having depression (according to a test score), data from wearable medical devices, general hospital systems data, the ability to pay for healthcare, and “total cost risk scores” (how much a person would cost the healthcare system over a period of months).

Advertisers are eager to buy up data like this and use it for highly targeted marketing. Health insurance providers could conceivably buy medical data and use it to discriminate, charging individuals more for care, or to target vulnerable populations with advertising, according to Kim. She also talks about the risk of scammers buying this data and using it to exploit and steal from people living with mental health conditions. But advertisers and scammers aren’t the only customers: governments are finding it easier and cheaper to buy commercially available information than to deal with the restrictions they’d face trying to collect the data themselves. It’s harder for the government to wiretap you, track you, or search your cellphone than it is to simply buy that information from someone else.

Is that…legal?

Well…yes. Pretty much? The information sold is often a mix of private and/or protected data, but the lack of federal regulation around data creates a convenient loophole for buyers, especially government agencies. The absence of legislation also seems to be working well for advertisers and for the data brokers profiting from those buying their information. A win for them, and a huge loss for the American public, whose data is being sold.

Diving specifically into the mental health data example, Kim talks about how most mental health apps (a source of much of this information) are not covered by HIPAA (Health Insurance Portability and Accountability Act) regulations, nor are wearables or social media platforms. It’s completely legal to sell this data, even though Kim argues that it shouldn’t be.

Dangers of selling sensitive data

I’ve already highlighted some examples of what can happen when people buy sensitive data, but there are countless more “what ifs.” McKenna says it’s possible for the government to use location data bought from data brokers to prosecute someone for having an abortion. Mental health data can be used to discriminate against employees. Sure, such a thing is illegal, but how would an employee know it had happened? What if an employer saw a diagnosed mental illness and decided not to hire someone, or to let an existing employee go? Or an insurance company raised your rates because it sees you are susceptible to depression? The list goes on. Individuals with mental illness already face stigma and obstacles to getting care, especially those who are part of marginalized groups. This kind of data abuse would only make their situation worse.

Legal defenses

Federal regulation on this topic is far behind, as it is with most privacy concerns. But there are ways Congress could address the problem, according to McKenna. There is a bipartisan proposal for a National AI Commission, as well as a proposed AI regulation framework. Kim states that the U.S. needs a comprehensive federal privacy law. Such a law should include a way for consumers to opt out of data collection and to access their information. Data brokers, too, should be obligated to be transparent about their use and exchange of data. But while we wait for a federal privacy law, she suggests an alternative: extending HIPAA’s protections to cover emerging mental health technologies and the data economy.

