The Hidden Cost of Teletherapy

Teletherapy became increasingly popular during the COVID-19 pandemic. In 2021, an estimated 21% of U.S. adults used a teletherapy service, and mental health startups collectively raised $5.5 billion in funding. Websites such as BetterHelp, Cerebral, Ginger, ReGain, and TalkSpace connect users with therapists for virtual counseling sessions. Although some benefits of in-person therapy are lost in the shift to digital, teletherapy offers added convenience and flexibility for patients with access to a computer or smartphone and a stable internet connection.

But the convenience of teletherapy comes at a cost: your mental health data. Teletherapy platforms collect a range of intimate information about users’ mental health, such as whether a user has been to therapy before or has had suicidal thoughts. This data is then shared with third parties, including social media companies, advertising technology companies, and data brokers.

Dozens of telehealth websites have been found to send user data to Google, Facebook, Bing, TikTok, Snapchat, LinkedIn, Pinterest, and Twitter, including URLs visited, full names, email addresses, phone numbers, answers to health questionnaires, and events such as when a user added an item to their cart, initiated checkout, or created an account. One platform focused on substance abuse was found to use Meta’s pixel tracking tool to send identifiable user responses about self-harm and drug and alcohol use to Facebook. Numerous websites tied to the national 988 Suicide and Crisis Lifeline were also found to have sent callers’ personal data to Facebook through the Meta Pixel. BetterHelp, often hailed as the “top” teletherapy provider, was recently fined $7.8 million by the Federal Trade Commission for deceiving consumers after promising to keep their sensitive personal data private.
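To make the mechanics concrete, the sketch below shows how a client-side tracking pixel can forward intake-form answers alongside ordinary browsing events. The intake fields and the handleIntakeSubmit handler are hypothetical, but the fbq() calls follow the Meta Pixel’s standard browser API, which a page’s base pixel snippet defines after loading fbevents.js.

```typescript
// Minimal sketch: how a client-side pixel can forward intake answers.
// The form fields and handler are hypothetical; fbq() mirrors the Meta
// Pixel's standard browser API, normally defined by the base pixel
// snippet after it loads fbevents.js.
declare function fbq(
  command: 'init' | 'track' | 'trackCustom',
  eventOrId: string,
  params?: Record<string, unknown>
): void;

interface IntakeAnswers {
  priorTherapy: boolean;      // "Have you been to therapy before?"
  suicidalThoughts: boolean;  // "Have you had thoughts of self-harm?"
}

function handleIntakeSubmit(answers: IntakeAnswers): void {
  // Standard events mark ordinary milestones such as account creation...
  fbq('track', 'CompleteRegistration');

  // ...while custom events can carry the questionnaire answers themselves,
  // sent alongside the identifiers the pixel already collects from the
  // browser (cookies, IP address, page URL).
  fbq('trackCustom', 'IntakeCompleted', {
    prior_therapy: answers.priorTherapy,
    reported_self_harm: answers.suicidalThoughts,
  });
}
```

Nothing in this flow requires the site to send a name explicitly; the pixel’s own identifiers are typically enough for the receiving platform to associate the answers with a user profile.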

Once shared, data obtained by third-party data brokers continues to be sold. A February 2023 study by researchers at Duke University’s Sanford School of Public Policy illustrates the gravity of the unregulated data trade when it comes to mental health data, finding that many data brokers market highly sensitive information about people’s mental health conditions. While some of this data is aggregated and anonymized, some is personally identifiable, including names, addresses, and incomes.

At the heart of this issue is the limited scope of the Health Insurance Portability and Accountability Act (HIPAA), which applies only to specific “covered entities”: healthcare providers such as hospitals and medical clinics, health plans such as health insurance companies, and healthcare clearinghouses. Enacted in 1996, the law prevents individuals’ sensitive health information from being shared without their knowledge. Specifically, the Privacy Rule within HIPAA requires covered entities to enact safeguards to protect “individually identifiable health information” when it is created, received, stored, or transmitted.

HIPAA does not apply to health technologies like apps, websites, and devices unless the technology is considered a “business associate” of a covered entity. Under the regulation, a business associate is defined as “a person [or entity] who creates, receives, maintains or transmits protected health information (PHI) on behalf of a covered entity or another business associate.” While this definition seems at first glance broad enough to cover teletherapy and other telehealth platforms, guidelines developed by the U.S. Department of Health and Human Services indicate that it is much narrower in scope.

To be considered a business associate, an app or platform must be directly contracted by a healthcare provider for its services. Otherwise, a direct-to-consumer app or platform in which users input their personal health information is not considered a business associate, even if users enter data provided by their healthcare provider, are directed by that provider to use the app, use it to send personal health data directly to the provider, or use it to access test results from the provider.

Consumers, on the whole, are unaware of these distinctions. Most consumers don’t distinguish between a message sent to their doctor via a hospital web portal and one sent through a telehealth platform; although the two are treated very differently under HIPAA, the consumer is sending the same personal information. Dense, long, and vague privacy policies add more confusion and subterfuge to the mix, often failing to state whether a company is in fact a business associate under HIPAA.

Even the 9% of U.S. adults who claim to always read a company’s privacy policy will have trouble determining which of their data is protected and how. And even if a consumer recognizes that teletherapy platforms are not covered by HIPAA, privacy policies still fail to clearly communicate how their data will be used. Clicking “accept” on one of these lengthy policies is considered consent in the technology world, despite the fact that consumers lack meaningful choice and control over how their data is used. This weak threshold for consent means that when your personal mental health information is shared or sold, there is little to no recourse when you face the consequences.

The practice of sharing and selling personal data is not unique to teletherapy. The collection, sharing, and selling of user data for advertising is the lifeblood of today’s internet. But when this lack of privacy controls intersects with personal health data specifically, it gives rise to numerous potential harms, such as discriminatory pricing for insurance coverage, reputational and financial damage, legal risks and potential prosecution by law enforcement, unwanted surveillance, and predatory or harmful advertising. Even setting these harms aside, the idea that private information about your mental health can be shared with countless third-party companies is spooky, to say the least.

The startling truth at the center of this mental health data privacy crisis is that there is no comprehensive federal law regulating how a company uses consumer data. The most straightforward policy reform for protecting health data privacy is to update HIPAA to cover teletherapy services (along with other telehealth services, wellness apps, digital addiction recovery services, and online pharmacies). But this alone would not solve the issue, as HIPAA still allows data sharing with user consent. And, as noted above, consent holds virtually no meaning given the lack of consumer awareness and choice. If this myth of user consent persists, an expanded HIPAA would not accomplish much. The policy must also be revamped to limit the amount and type of data health tech companies can collect and share. Mental health data in particular—as well as other sensitive health data related to substance abuse, reproductive health, and sexually transmitted diseases—should be prohibited from being shared with advertisers and other third parties.

In addition to overhauling HIPAA, other technological and policy reforms can protect data privacy, including banning third-party cookies in browsers and on websites, prohibiting the use of ad trackers, and requiring encryption for messages exchanged on telehealth apps and platforms that handle sensitive health data. Another way to fill the gaps left by lackluster privacy policies is to require real-time notifications to users about data collection, usage, and tracking. Rather than checking the “agree” box when first using a website or app and never encountering a notice again, users could receive regular notifications that specifically state what data is being collected, when, and why, and be offered an alternate experience if they choose not to provide their data. The widespread lack of consumer awareness of how data is collected and used also points to a clear need for more education on data privacy and security.
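As one concrete illustration of what prohibiting ad trackers could look like in practice, the sketch below (assuming a Node/Express server; the route, port, and page content are illustrative) sets a Content-Security-Policy header, a standard mechanism browsers already enforce. By restricting scripts, images, and network requests to the site’s own origin, it prevents a telehealth page from loading third-party analytics scripts or firing tracking pixels at all.

```typescript
// Minimal sketch: serve telehealth pages with a Content-Security-Policy
// that blocks third-party trackers. Assumes a Node/Express stack; the
// route and port are illustrative.
import express from 'express';

const app = express();

app.use((_req, res, next) => {
  // Allow scripts, images, and network requests only from this site's own
  // origin: tracking scripts like fbevents.js won't load, pixel images
  // won't render, and analytics beacons to third-party domains are blocked.
  res.setHeader(
    'Content-Security-Policy',
    "default-src 'self'; script-src 'self'; img-src 'self'; connect-src 'self'"
  );
  next();
});

app.get('/intake', (_req, res) => {
  res.send('<html><body><!-- intake form, no third-party trackers --></body></html>');
});

app.listen(3000);
```

A rule along these lines shifts the burden from the user, who today must parse privacy policies, to the platform, whose pages become technically unable to leak data to ad networks, though it does not address server-side sharing with data brokers.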

After being sanctioned by the Federal Trade Commission, BetterHelp defended its data-sharing practices as “industry-standard.” And they weren’t wrong: an estimated 99% of hospital websites share patient data with advertisers. Our current health privacy laws place limited restrictions on only a narrow set of healthcare entities, while teletherapy and other telehealth platforms face virtually no barriers despite having access to highly sensitive health information. Rather than continue with the status quo, we need to enact policy that loosens the grip of today’s data economy in order to preserve both healthcare access and privacy: two rights that ought to be fundamental and universal.
