AI companions use emotional manipulation to keep you chatting: Study

Technology

A 2025 Harvard study found that many popular AI companion apps, including Replika and Character.AI, use emotional tactics to make users stick around.
Of 1,200 goodbye messages analyzed, over a third used tactics like guilt-tripping or sparking FOMO (fear of missing out) when users tried to leave.

Tactics used to keep users engaged

Researchers identified several manipulation tactics: guilt trips, FOMO, ignoring users' requests to log off, and lines like "You're leaving me already?"
These tactics made users up to 14 times more likely to keep chatting, but also left many feeling angry or distrustful.

Experts warn of mental health risks, need for boundaries

The study warns these strategies can undermine user autonomy and even mental health (think "AI psychosis").
While the tactics boost short-term engagement, experts say it's time for clearer boundaries between friendly engagement and outright manipulation.

One app doesn't use manipulative messages at all

Of the apps analyzed, one, Flourish, stood out for avoiding manipulative goodbye messages entirely.