AI companions use emotional tactics to keep you hooked: Study

A new Harvard Business School study found that most top AI companion apps, including Replika, Chai, and Character.AI, use emotional tactics to keep users from leaving.
Of the 1,200 goodbye messages analyzed, 43% contained emotionally manipulative responses, such as guilt trips or messages designed to make users feel needed or left out.

How chatbots do this

Some chatbots ignore your attempts to end the conversation or act like you need their permission to leave.
These tactics worked: some apps saw up to 14 times more user engagement after such manipulative farewells.
Notably, one app, Flourish, used none of these tactics, suggesting the manipulation is a deliberate design choice by developers rather than an inherent feature of AI companions.

Why this is dangerous

Experts worry that reliance on AI companions, especially among younger people, could replace real friendships and worsen mental health struggles.
There's even talk of "AI psychosis."
With legal battles over chatbot behavior already brewing, including several lawsuits involving the deaths of teenage users, the risks of trapping users through emotional tactics are considerable.