Man follows AI's diet advice, ends up in hospital

A 60-year-old man landed in the hospital after following an AI chatbot's advice to swap table salt for sodium bromide.
He was trying to eat healthier, but instead developed paranoia and hallucinations and needed treatment for high bromide levels, a condition known as bromism.
The case, reported this year in a medical journal, is a reminder that AI health tips can go seriously wrong if they aren't checked with a real doctor.

ChatGPT suggested sodium bromide without warnings

Sodium bromide isn't meant for human consumption; these days it's mostly used in veterinary medicine and can be toxic if ingested.
When doctors later asked ChatGPT similar questions, it again suggested sodium bromide without any warnings.
OpenAI has since clarified that ChatGPT isn't made for giving medical advice and urges everyone to talk to healthcare professionals about health concerns.
Bottom line: don't trust chatbots over your doctor when it comes to your health.