
Man takes ChatGPT's advice, ends up with psychosis


A man developed bromide poisoning and psychosis after taking sodium bromide for three months, acting on ChatGPT's suggestion to use it in place of the chloride in his diet.
Reported by University of Washington doctors in August 2025, the case highlights the potential dangers of AI health advice.
The man experienced paranoia, hallucinations, and even refused to drink water despite feeling thirsty.

How the doctors treated him

Bromide is toxic in large amounts or over long periods; it disrupts the brain and nervous system.
The doctors treated him with IV fluids, antipsychotics, and supplemental chloride to help his body flush out the bromide.
Thankfully, after three weeks in care, he recovered fully and stayed well afterward.

Why AI can't replace human doctors yet

The man took sodium bromide because ChatGPT suggested it as a chloride substitute, without warning about its risks or explaining when it's actually safe to ingest (hint: almost never).
This shows why real medical experts are still crucial: AI can miss context and give advice that isn't just unhelpful but actively harmful.