A 60-year-old man in Washington State presented to the emergency department with severe psychiatric symptoms, including paranoia and hallucinations, after following dietary advice from the AI chatbot ChatGPT. The man had read about the negative health effects of table salt (sodium chloride) and asked ChatGPT for alternatives. Based on its response, which suggested bromide as a substitute for chloride, he replaced all sodium chloride in his diet with sodium bromide purchased online.
After three months on this self-imposed diet, the man developed a rare condition known as bromism, a syndrome caused by chronic bromide exposure. Bromide accumulates in the body and can impair neuronal function, producing neuropsychiatric symptoms such as psychosis, agitation, and impaired muscle coordination. He initially believed his neighbor was poisoning him and exhibited excessive thirst combined with a refusal to drink the water offered at the hospital, as well as hallucinations, insomnia, fatigue, and dermatological signs such as facial acne and small red skin growths.
Medical investigations revealed “pseudohyperchloremia,” a falsely elevated blood chloride reading caused by bromide interfering with the laboratory chloride assay. His condition required hospital admission for electrolyte monitoring, intravenous fluids, and antipsychotic medication. Over three weeks, his mental state improved, and he was discharged in stable condition.
Doctors noted that bromide salts were widely used in sedatives and sleep aids in the 19th and 20th centuries but were removed from over-the-counter medicines by the 1980s because of toxicity concerns. Today, bromide is rarely found in human medical treatments, though it can still appear in some supplements and industrial products.
The case report, published in Annals of Internal Medicine Clinical Cases, underscores the dangers of relying on large language models (LLMs) like ChatGPT for medical or dietary advice without professional oversight. When the authors simulated the patient’s query, ChatGPT again mentioned bromide as a substitute for chloride without warning about its severe health risks, illustrating how decontextualized or incomplete AI-generated information can be hazardous in health matters.
This incident serves as a cautionary tale illustrating that while AI can provide accessible scientific knowledge, it cannot substitute for personalized medical advice from qualified healthcare professionals.
Disclaimer: This article is for informational purposes only and is not intended to provide medical or dietary advice. Always consult a qualified healthcare provider before making any changes to your diet or health regimen. AI tools such as ChatGPT should not be relied upon as the sole source for medical decisions.