Most people don’t read the privacy policy before opening up to an AI companion. That’s not a criticism. The whole point of these apps is to lower your guard. To say things you’d find harder to say to a person. Reading legal documents first is not exactly the vibe.
But the data these apps collect is worth understanding. Not because you should be paranoid, but because what gets stored shapes what the app can do with your information long after you stop using it.
What Gets Collected
The short answer: most of what you say.
AI companion apps store your conversations as a matter of basic operation. The model needs your history to maintain continuity. That’s not sinister on its own. The question is what else happens to that data.
Most apps also collect:
- Device information and IP address
- Usage patterns (when you open the app, how long you stay, what features you use)
- Profile information you provide (age, location, relationship status, interests)
- In some cases, emotional tone: some apps track whether your messages read as positive or distressed
Replika’s privacy policy, for instance, states they collect “emotional states inferred from conversations.” That’s a sentence worth sitting with. The app is not just storing what you say. It’s drawing conclusions about how you feel.
How Long They Keep It
This is where the policies diverge most sharply.
Some apps delete conversation data when you delete your account. Others retain it in anonymised or aggregated form indefinitely. A few are vague enough about retention periods that you genuinely cannot tell.
Character.AI’s terms allow them to keep data even after you close your account “as permitted by law.” That’s a wide window. It doesn’t mean they’re doing something harmful with it. It means you have limited control over what happens once you’ve shared it.
The problem isn’t necessarily current behaviour. It’s that the policies are written to preserve maximum future flexibility. A company can change what it does with old data as long as it updates its terms and notifies users.
What They Do With It
The honest answer is: it varies, and it can change.
Most companion apps use your conversation data to improve their models. That typically means human reviewers may read a sample of conversations to evaluate quality. Your most private disclosures could end up in a training review queue, stripped of your name but not necessarily of their content.
Some apps share data with third-party advertising partners. Replika’s policy lists this as a possibility. It doesn’t mean they’re selling your therapy sessions to advertisers. It means the door isn’t fully closed.
A few apps have been more transparent. Pi by Inflection has relatively readable privacy documentation and has positioned user trust as a core product value. That could change if the company’s ownership or priorities shift, which is not hypothetical. Inflection’s key staff moved to Microsoft in 2024.
The Real Risk Isn’t Today
People tend to worry about the wrong thing. The question isn’t “is some employee reading my chats right now?” They probably aren’t.
The real questions are longer-term. What happens if the company gets acquired? What happens if they change their business model? What happens if there’s a data breach?
When Replika changed its relationship features in 2023, users discovered they had no ownership over conversations they’d had for years. The data existed, the relationship was gone. That’s a different kind of loss than a breach, and it happened without any security failure at all.
Your AI companion knows things about you that you haven’t told some of your closest friends. The companies holding that data are, in many cases, young startups with uncertain futures.
That’s not a reason to avoid these apps. It’s a reason to be thoughtful about what you share in them.
What You Can Actually Do
Read the data export and deletion options before you invest emotionally in a platform. Most apps let you download your conversation history and delete your account. Check whether deletion is real deletion or just account deactivation with data retained in their systems.
Treat sensitive disclosures the way you’d treat any digital communication: not as fully private. That’s a lower bar than it sounds. Most people share significant things in WhatsApp, email, and voice assistants without treating those as fully private either.
If you want a companion with a cleaner privacy posture, Pi is currently the most transparent about data practices. Nomi AI’s policy is shorter and more straightforward than most.
Eudaio takes the most explicit stance of any app in this space. Your conversations are obfuscated at the system level. The company states it cannot read your chats unless you share a specific chat ID for support purposes. There are no third-party tracking cookies at all, which is why there’s no cookie banner. Payment card details are never stored on their servers. And deleting your account means permanent deletion, not just deactivation with data retained in aggregate.
For a deeper look at how these apps actually perform as companions, the Replika review covers the experience side. The privacy policy is only one part of whether an app is worth your time.
Frequently Asked Questions
Are AI companion conversations private?
In most apps, no. Conversations are stored and human reviewers can access a sample for quality evaluation. Your conversations are unlikely to be read individually, but they’re not fully private. Eudaio is the exception: conversations are obfuscated at the system level and the company states it can’t read your chats.
Do AI companion apps sell your data?
Most don’t sell data directly, but many share it with advertising partners or use it for model training. Replika lists sharing with third-party advertising partners as a possibility. Check the privacy policy before you invest emotionally in a platform.
What data does an AI companion app collect?
At minimum: your conversation history, device information, and usage patterns. Most also collect profile information you provide. Some apps, including Replika, infer and store emotional states from your conversations. The exact scope varies and is detailed in each app’s privacy policy.
What happens to my data if an AI companion app shuts down?
Most policies allow companies to retain anonymised data after account closure, sometimes indefinitely. “Anonymised” is often doing significant work in those policies. Eudaio explicitly commits to permanent deletion when you close your account. Most others don’t make the same commitment.