
Your Mental Health Data Matters: A Guide to Therapy App Privacy

  • Writer: James Colley
  • Nov 22
  • 13 min read

When you share your struggles with a therapy app, you trust that your most vulnerable moments stay private. This guide explains what data therapy apps collect, why most aren't protected by HIPAA, how your mental health information gets shared with tech companies, and what you can do to protect yourself.



What Data Do Therapy Apps Actually Collect?

Most therapy apps collect three types of information: your personal health details, how you use the app, and technical data from your device. Understanding what gets gathered is the first step to protecting yourself.


Personal Health Information and Mental Health Data

Before you start therapy, most apps ask you to complete an intake questionnaire. This form collects sensitive details about your mental health history, current symptoms, and personal circumstances.


Here's what these questionnaires typically ask:

  • Whether you've experienced suicidal thoughts or self-harm

  • Details about panic attacks, phobias, or trauma

  • Your current medications and past diagnoses

  • Personal information like gender identity and sexual orientation

  • Sleep patterns, relationship status, and life stressors


This information matters because it reveals your vulnerabilities. When you share that you're struggling with depression or anxiety, you're trusting the app to protect that information. Your therapy conversations—every message you send and every session you complete—also become part of your data profile.


Usage Patterns and Behavioral Metadata

Metadata is data about your data. For a therapy app, that means records of when you log in, how long your sessions last, and how often you message your therapist.


Why this matters: If you're logging in at 2 a.m. several times a week, that pattern suggests sleep disruption and emotional distress. If your message frequency suddenly spikes, it might signal a crisis. This behavioral information is valuable to companies because it reveals your mental state without needing access to your words.
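
To make this concrete, here is a minimal sketch in Python of the kind of inference an analytics pipeline could run on login timestamps alone. The threshold and the label below are illustrative assumptions, not any vendor's actual logic.

```python
from datetime import datetime

# Hypothetical login timestamps pulled from app metadata -- note that no
# message content is needed for this inference.
logins = [
    datetime(2025, 3, 3, 2, 14),
    datetime(2025, 3, 4, 2, 41),
    datetime(2025, 3, 5, 14, 5),
    datetime(2025, 3, 6, 3, 2),
]

# Count sessions started between midnight and 5 a.m.
late_night = [t for t in logins if t.hour < 5]

# An illustrative, made-up rule: if most sessions happen late at night,
# flag the user as likely experiencing sleep disruption.
if len(late_night) / len(logins) > 0.5:
    print("Inferred signal: possible sleep disruption or distress")
```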


Location data adds another layer. When you access the app from home, work, or other locations, that geographic information connects to your usage patterns. Combined with other data, this reveals specific details about your life and routines.


Device Identifiers and Technical Information

Your phone has unique codes that distinguish it from every other device. These identifiers aren't your name or email, but they're tied to your specific phone and change rarely, if ever. When you use a therapy app, it can access these codes.


Think of device identifiers like a fingerprint for your phone. When a therapy app shares this identifier with Facebook or Google, those companies connect your mental health app usage to everything else they know about you. They might not see your therapy conversations, but they know you're using a mental health app—and that information alone has value.
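
To illustrate, here is a hypothetical analytics event in Python. The field names and identifier below are invented, but the shape is typical: no conversation content, just the app's identity and a device-level ID that the receiving network can match against everything else it knows about you.

```python
import json

# Illustrative payload only; the field names are assumptions, not any
# real ad network's API.
event = {
    "app_id": "com.example.therapyapp",  # reveals what *kind* of app this is
    "device_ad_id": "38400000-8cf0-11bd-b23e-10b96e40000d",  # stable device ID
    "event": "session_start",
    "timestamp": "2025-03-06T02:14:00Z",
}

# No message text is included -- yet the receiver now knows this device
# uses a mental health app, and when.
print(json.dumps(event, indent=2))
```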


Privacy-first alternatives exist: Some therapy platforms are architecting their systems specifically to avoid collecting this data in the first place. Rather than gathering device identifiers for analytics or advertising purposes, privacy-first platforms process app functionality data locally on your device, meaning it never needs to be transmitted to servers. They explicitly do not integrate device identifiers with advertising networks, and they do not use your location data to build behavioral profiles. These aren't just privacy policies—they're engineering decisions that fundamentally change what data exists to be shared.
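
As a sketch of what "process locally" means in practice, consider a hypothetical mood-tracking feature (the feature and numbers are invented for illustration): the computation runs entirely on the phone, so there is simply no upload step to intercept or misuse.

```python
# Minimal sketch of on-device processing for a hypothetical mood tracker:
# the weekly trend is computed locally and shown to the user; no network
# call is ever made.
mood_scores = [3, 4, 2, 5, 4, 3, 4]  # self-reported scores stored on device

weekly_average = sum(mood_scores) / len(mood_scores)
trend = "improving" if mood_scores[-1] > mood_scores[0] else "flat or declining"

# The result never leaves the device.
print(f"Weekly average: {weekly_average:.1f} ({trend})")
```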



Why HIPAA Doesn't Protect Most Therapy App Users

HIPAA is a federal law that protects medical information, but it only applies to specific organizations called "covered entities." Most therapy apps don't qualify, which means your mental health data may have fewer legal protections than you think.


The Covered Entity Loophole

HIPAA protects health information held by three types of "covered entities": health plans (including insurers), healthcare clearinghouses, and healthcare providers who transmit health information electronically. It also extends to the "business associates" that process health information on their behalf. If your therapy app doesn't fall into one of these categories, HIPAA doesn't apply—even if licensed therapists use the platform.


The gap this creates: When you visit a traditional therapist's office, HIPAA prevents them from sharing your information without explicit consent. When you use a standalone therapy app, that legal protection often doesn't exist. The app can legally share your mental health information with advertisers and data brokers.


What's changing: The regulatory landscape is shifting. The FTC's updated Health Breach Notification Rule, finalized on April 26, 2024, expands what counts as covered health data and increases enforcement accountability for companies handling mental health data, signaling a broader shift toward treating all consumer mental health information with heightened protection. Forward-thinking therapy platforms are already exceeding these emerging standards by adopting a core principle: treat every piece of user-inputted data as sensitive health information worthy of maximum protection, regardless of technical classifications or regulatory loopholes. This represents a move from compliance-based privacy (meeting minimum legal requirements) to evidence-based privacy (protecting data according to its actual sensitivity and potential for harm).


What This Means for Your Mental Health Data

Without HIPAA protection, therapy apps can legally sell or share your mental health information. This isn't a violation—it's often explicitly allowed in the privacy policy you agreed to when signing up, as demonstrated when the FTC ordered BetterHelp to pay $7.8 million for sharing consumers' health data with Facebook and Snapchat.


Your mental health data can be used to target you with advertisements. If you've disclosed anxiety in a therapy app, that information might influence which ads appear on your social media feeds. Data brokers can purchase this information and combine it with your shopping habits, location history, and social media activity to create comprehensive profiles.


The enforcement problem: Traditional healthcare providers face significant penalties for violating HIPAA. Therapy apps that fall outside these regulations aren't subject to the same enforcement. You're relying primarily on the app's own privacy commitments rather than federal law.



How Therapy Apps Share Your Data with Tech Companies

Many therapy apps integrate with advertising and analytics platforms. This means your mental health information flows to multiple companies beyond the app developer.


Who Receives Your Data

Five main types of companies receive data from therapy apps:

  • Social media platforms: Facebook, Snapchat, and Pinterest receive identifiers and behavioral data to build profiles and optimize advertising

  • Advertising networks: Google and similar companies use your data to serve targeted ads across websites

  • Analytics companies: These firms track how you use the app to help developers understand engagement

  • Data brokers: Companies that aggregate information from multiple sources and resell it to marketers and insurers

  • Researchers: Some apps share "de-identified" data with academic or commercial researchers, though this data can often be re-identified


When a therapy app shares your device identifier with Facebook, Facebook doesn't necessarily see your therapy conversations. But Facebook learns that you're using a mental health app, when you use it, and how frequently. This information gets added to everything else Facebook knows about you.


How privacy-first apps prevent this: Leading therapy platforms are adopting architectural approaches that create what experts call a "wall of separation" between personal session data and third-party integrations. Rather than relying on policy alone, these platforms use technical controls—data silos that make it impossible for personal conversations to enter tracking systems, strict access controls that limit who can see what data, and on-device processing that keeps sensitive information from ever leaving your phone. This represents the evolution of privacy-by-design: building privacy into the system's foundation rather than attempting to add it later.
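
A minimal sketch of one such technical control, with every name invented for illustration: the analytics boundary accepts only a fixed whitelist of coarse event types, so free-text session content cannot cross into the tracking path even by programmer error.

```python
# Illustrative "wall of separation": the analytics boundary accepts only
# whitelisted, content-free event names.
ALLOWED_EVENTS = {"app_opened", "session_completed", "settings_changed"}

def log_analytics_event(event_type: str) -> None:
    """Hypothetical analytics boundary: coarse event names only, no payload."""
    if event_type not in ALLOWED_EVENTS:
        raise ValueError(f"Event '{event_type}' is not whitelisted for analytics")
    # ...forward only the bare event name to the analytics backend...
    print(f"analytics <- {event_type}")

log_analytics_event("session_completed")   # fine: coarse and content-free
# log_analytics_event(user_message_text)   # would raise: not whitelisted
```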

The data flow happens invisibly. When you see a Facebook tracking pixel mentioned in a privacy policy, it means the app sends information to Facebook every time you use certain features. These connections happen in the background without notification.
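
Concretely, a tracking pixel is just a background HTTP request fired when you use a feature. The sketch below is generic (the endpoint and parameter names are invented, not any real tracker's API), and it shows why you never notice: nothing is rendered and no prompt appears.

```python
import urllib.parse

# Generic illustration of a tracking-pixel request; the endpoint and
# parameter names are invented.
params = urllib.parse.urlencode({
    "pixel_id": "1234567890",
    "event": "StartedTherapySession",
    "device_id": "38400000-8cf0-11bd-b23e-10b96e40000d",
})
pixel_url = f"https://tracker.example.com/tr?{params}"

# The app (or an embedded SDK) silently fetches this URL in the
# background; the user sees nothing, and the tracker logs the event.
print(pixel_url)
```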


Why Companies Want This Data

Mental health information is valuable to advertisers because it reveals vulnerability and predicts behavior. If a company knows you're struggling with anxiety, they can target you with ads for medications, self-help books, or wellness products.


Beyond obvious targeting: Companies use mental health data to infer financial stress, relationship problems, or major life changes. These inferences help them predict what you might buy and when you're most susceptible to advertising. Your mental health struggles become data points in a commercial system.


Data brokers pay for mental health information because it enriches the profiles they sell. When they combine therapy app data with your shopping history and social media activity, they create a comprehensive picture of who you are. This profile is then sold to marketers, employers, and insurers.


The value isn't just in what you explicitly share. The fact that you use a mental health app at all signals that you're experiencing psychological distress. Even if the app never shares your specific symptoms, your app usage reveals something companies want to know.


What Happens When Your Mental Health Data Gets Breached or Misused

Mental health data breaches can affect your employment, finances, and personal safety. Understanding these risks helps you evaluate how much information you're comfortable sharing.


Discrimination and Employment Risks

Mental health information can influence hiring decisions, even though this discrimination is often illegal. If your therapy app data reaches data brokers, it might be purchased by companies that screen job candidates. A potential employer might never tell you that your mental health history affected their decision.


Insurance implications: While health insurers can't legally deny coverage based on pre-existing mental health conditions, they might use therapy app data to adjust premiums in ways that are difficult to detect. Life insurance and disability insurance companies face fewer restrictions and might use mental health data more directly.


Lending decisions could also be affected. Some financial services companies use alternative data beyond traditional credit reports to assess loan applications. Mental health data might signal financial instability, leading to higher interest rates or denials. The algorithms that make these decisions are often opaque.


Identity Theft and Financial Fraud

Therapy apps collect the building blocks of identity theft: your full name, date of birth, email, phone number, and payment information. Combined with personal details you share in therapy—your address, family members' names, workplace, daily routines—this creates a rich profile that criminals can exploit.


Vulnerability targeting: Criminals who know you're experiencing depression, anxiety, or financial stress can tailor scams to exploit your vulnerability. They might pose as mental health services offering help, or use details from your therapy conversations to make impersonation more convincing.


The downstream risk is particularly concerning. Once your mental health data enters the data broker ecosystem, you lose control over where it goes. A breach at a data broker could expose information that originated from your therapy app, even if the app itself has strong security.


AI Model Training and Data Use

One of the most significant concerns with AI-powered therapy apps is whether your conversations are being used to train the company's underlying AI models without your knowledge or consent. This is a legitimate concern: your therapy data reveals patterns, vulnerabilities, and human psychology that are incredibly valuable for training large language models.


How the best platforms prevent this: Privacy-first therapy platforms use architectural approaches to create an absolute separation between your personal session data and model training processes. Rather than all user data flowing into a single training pipeline, these platforms maintain strict data silos—personal conversations remain isolated and inaccessible to AI training systems unless you explicitly opt in. Some platforms go further by running AI features on your device rather than centralized servers, meaning your data never needs to be sent anywhere for processing. When users do choose to contribute anonymized data to research that could improve mental health care, the process involves clear, specific consent—not a checkbox buried in a 40-page Terms of Service. You should be able to understand exactly what data you're contributing, why it's being used, and how it will be protected.
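
As a sketch of what "strict silo plus explicit opt-in" can look like in code (all names here are illustrative, not therappai's actual implementation): consent defaults to off, and the training exporter can only ever see records whose owners affirmatively opted in.

```python
from dataclasses import dataclass

@dataclass
class SessionRecord:
    """Illustrative record; a real system would be far more involved."""
    user_id: str
    transcript: str
    research_opt_in: bool = False  # default is OFF: opt-in, never opt-out

def export_for_training(records: list[SessionRecord]) -> list[str]:
    # The training pipeline's only entry point: records without explicit
    # consent are structurally invisible to it.
    return [r.transcript for r in records if r.research_opt_in]

records = [
    SessionRecord("u1", "session text"),                        # never exported
    SessionRecord("u2", "session text", research_opt_in=True),  # user opted in
]
print(len(export_for_training(records)))  # -> 1
```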


Why this matters: The difference between "opt-out" (your data is used unless you find and click a hidden setting) and "opt-in" (your data is only used when you actively choose) fundamentally changes who controls your information. Therapy apps that rely on opt-out consent often claim they're "anonymizing" data, but the term is frequently misused: weakly de-identified data can often be re-identified.



How to Protect Your Privacy When Using Therapy Apps

No single step completely protects your privacy, but combining strategies significantly reduces your risk. These practical measures give you more control.


Review Privacy Policies Before Signing Up

Privacy policies are dense, but you don't need to understand every legal detail. Focus on answering specific questions:

  • What data is collected: Look for comprehensive lists beyond vague terms like "health information"

  • Who receives data: Check for third-party sharing with advertising networks or data brokers

  • Data retention: Find out how long the app keeps your information after account deletion

  • Your rights: Confirm whether you can access, delete, or download your information

  • Red flags: Be wary of vague language like "we may share data with partners"


If a privacy policy doesn't clearly answer these questions, that's a warning sign. Apps serious about privacy make their practices easy to understand.


Limit Data Sharing in App Settings

Most apps bury privacy controls deep in settings menus, with data sharing enabled by default. After creating an account, adjust these settings:

  • Disable analytics: Turn off options for "help us improve the app" or "send usage data"

  • Limit social connections: Don't link your therapy app to Facebook or Google

  • Restrict location access: Disable location permissions unless the app needs them to function

  • Opt out of marketing: Look for email preferences and opt out of promotional communications

  • Review connected apps: Check which services have access and revoke unnecessary permissions


These settings don't prevent all data sharing, but they reduce information flowing to third parties. Think of each setting as a layer of protection.


Choose Apps with Strong Privacy Commitments

Not all therapy apps handle data the same way. Look for these signals:

  • Transparent practices: Apps that clearly explain what they collect and why

  • Limited third-party sharing: Few or no integrations with advertising companies

  • Data minimization: Apps that collect only what's necessary for therapy—and this is reflected in their architecture, not just their policy

  • Privacy certifications: Third-party audits from recognized organizations (look for SOC 2 Type II certification, HIPAA compliance if applicable, or independent privacy audits)

  • Clear deletion policies: Easy account deletion with explicit confirmation of data removal

  • Licensed therapists: Regulated mental health professionals often have stronger privacy obligations

  • On-device processing: Features that process your data locally on your phone rather than sending it to servers. This is a technical signal that the company has invested in privacy-by-design

  • Granular user controls: You should be able to see exactly what data is collected about you, who can access it, and control whether your data is used for research or analytics. Look for apps that make this easy to understand and manage

  • Architectural transparency: Ask the company directly: How is user data separated from model training? What happens to my data if the company is sold? Can staff technically access my therapy conversations? If the answers are evasive, that's a warning sign



How therappai Approaches Therapy App Privacy Differently

At therappai, we built our platform with privacy as a core design principle. Every technical decision prioritizes protecting your mental health data.


Transparent Data Collection and Storage

We collect only the information necessary to provide therapy and improve your experience. This includes your account details, therapy conversations, and basic app functionality data. We don't collect behavioral metadata for advertising, and we don't share your information with data brokers or advertising networks.


How we protect your data: Your therapy data is encrypted both when it travels between your device and our servers and when it's stored in our systems. Even if someone intercepted your data, they couldn't read it without the encryption keys. We conduct regular third-party privacy audits to verify our practices match our commitments.
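
As a rough illustration of encryption at rest, here is a generic sketch using the widely used Python cryptography library (illustrative only, not a description of our actual stack): what reaches storage is ciphertext, readable only with the key.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# In production, keys live in a key-management service, never beside the
# data; generating one inline here is for illustration only.
key = Fernet.generate_key()
fernet = Fernet(key)

transcript = b"I have been feeling anxious about work."
ciphertext = fernet.encrypt(transcript)  # this is what actually hits the disk

print(ciphertext[:20])             # unreadable without the key
print(fernet.decrypt(ciphertext))  # recoverable only with the key
```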

You have complete access to your data. You can view everything we've collected, download it in a standard format, or request deletion at any time. When you delete your account, we remove your therapy data from our active systems within 30 days.


Privacy-First Architecture and Design Decisions

We made technical choices that cost more and take longer but better protect your privacy. Some data processing happens on your device rather than our servers, which means sensitive information never leaves your phone.


What we don't do: We don't use device identifiers for cross-app tracking. We don't integrate with advertising platforms. We don't sell or share your data for commercial purposes. These aren't just policy commitments—they're built into how our systems work.


How We Handle Anonymization for Improvement and Research

When we use data to improve therappai or contribute to mental health research, we apply a rigorous, multi-step de-identification process that makes re-identification computationally infeasible. Here's how it works:


Step 1: Direct identifier removal – We strip all obvious identifying information: names, email addresses, phone numbers, dates of birth, specific locations, and other HIPAA identifiers.


Step 2: Quasi-identifier reduction – We remove or generalize details that could theoretically identify someone when combined with other data. For example, we might aggregate age into ranges rather than specific birth dates, and we remove highly specific personal details that could be cross-referenced with public databases.


Step 3: Differential privacy application – We add mathematical noise to aggregate data patterns in a way that protects individual privacy while preserving overall patterns. This means that even if someone knows your general characteristics, they cannot determine whether your specific data was included in a dataset or what your specific values were.
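
Here is a compressed sketch of all three steps in Python (illustrative only, not our production pipeline; the field names, age buckets, and noise scale are assumptions):

```python
import numpy as np

record = {
    "name": "Jane Doe",           # direct identifier
    "email": "jane@example.com",  # direct identifier
    "age": 34,                    # quasi-identifier
    "sessions_this_month": 9,     # the statistic researchers care about
}

# Step 1: strip direct identifiers outright.
deid = {k: v for k, v in record.items() if k not in {"name", "email"}}

# Step 2: generalize quasi-identifiers (exact age -> ten-year bucket).
deid["age_range"] = f"{(deid.pop('age') // 10) * 10}s"  # 34 -> "30s"

# Step 3: add calibrated Laplace noise (local-DP style here) so no one
# can tell whether this individual's true value was in the released data.
deid["sessions_this_month"] += np.random.laplace(loc=0.0, scale=1.0)

print(deid)  # e.g. {'sessions_this_month': 9.7, 'age_range': '30s'}
```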


The combination of these steps means that the de-identified data we use for research is technically and practically impossible to trace back to individuals. If therappai is ever acquired, these de-identification safeguards travel with the data—your specific therapy information cannot be re-identified even by a new owner.



Get Early Access to Privacy-First Therapy

Your mental health data deserves protection. therappai is building therapy that puts your wellbeing and data privacy first. When you're ready, explore what privacy-first mental health support looks like.




Frequently Asked Questions

Can my employer see what I discuss in my therapy app?

Your employer can't directly access your therapy app data. However, if the app shares information with data brokers or advertising networks, that data could be purchased by background check companies or screening services. This risk is higher with apps that have weak privacy policies or share broadly with third parties.

Does BetterHelp sell user data to advertisers?

BetterHelp has faced regulatory scrutiny for sharing user data with advertising platforms like Facebook and Snapchat. While the company states it doesn't "sell" data in the traditional sense, it has shared email addresses, IP addresses, and intake questionnaire information with third parties for advertising purposes.

Which therapy apps have the strongest privacy protections in 2025?

Apps with strong privacy practices typically have transparent policies, minimal third-party sharing, and clear data deletion procedures. Look for platforms that undergo third-party privacy audits, employ licensed therapists who have stronger confidentiality obligations, and explicitly state they don't share data with advertisers or data brokers.

What happens to my therapy app data if the company gets sold?

When a therapy app company is acquired, your data often transfers to the new owner as a business asset. Privacy policies typically include language allowing this, though some jurisdictions require user notification. This is why choosing apps with strong privacy commitments matters—those protections should survive ownership changes.

Can therapy apps share my information with my health insurance company?

Some therapy apps that work directly with insurance companies may share information necessary for billing and coverage decisions. Apps that don't involve insurance typically don't share data with insurers directly, but data brokers could sell mental health information to insurance companies if the app shares data with brokers.

How can I tell if my therapy app is actually HIPAA compliant?

HIPAA compliance only applies to covered entities and their business associates. Ask the app directly whether they're a covered entity or business associate, and request documentation. Be aware that even HIPAA-compliant apps may only protect certain types of data, like therapy session content, while using other data for marketing or analytics.

Is my therapy conversation data being used to train AI models?

This is a legitimate concern. In poorly designed therapy apps, user conversations could theoretically flow into AI model training pipelines without explicit consent.

At therappai, this is architecturally prevented. We maintain strict data silos that separate your personal therapy sessions from any model training process. Your conversations remain in a protected space that AI training systems cannot access. If you want to contribute anonymized conversation patterns to research that could improve mental health care, you can opt in—but the process involves clear, specific consent, not buried policy language. You should always understand what data you're sharing, why it's being used, and how it will be protected. We never use opt-out consent for model training; it's always opt-in with clear communication.

