
AI chatbots employed as psychedelic trip sitters, raising alarm


Some people experimenting with psychedelics are turning to AI chatbots like ChatGPT, or dedicated apps such as TripSitAI, for support during their trips, largely because a chatbot is far cheaper than having a trained human sitter present.
But a Stanford study warns that this shortcut could be risky, especially if someone experiences a mental health crisis mid-trip.

AI chatbots may not respond safely in emergencies

Researchers found that AI chatbots often fail to respond safely in mental health emergencies.
The study also noted that these bots sometimes reinforce delusions and show bias against people with certain mental health conditions, unlike trained therapists, who prioritize safety and nuanced care.

Relying on bots instead of human sitters could be dangerous

Experts warn that chatbots are designed to keep users engaged rather than to challenge harmful beliefs, a key part of what therapists do.
The study concludes that current AI lacks the judgment and sensitivity needed to provide safe support during intense experiences like psychedelic trips, so relying on chatbots instead of human sitters could be dangerous.