Health
Medical Journal Warns Against Using AI for Health Advice

SEATTLE, Wash. — A recent article published in the Annals of Internal Medicine has raised alarms about using AI applications like ChatGPT for health information. The case report centered on a 60-year-old man who developed bromism, or bromide toxicity, after following dietary advice he found online.
The patient consulted ChatGPT about eliminating table salt from his diet after reading about sodium chloride’s negative effects. He then used sodium bromide over a three-month period, having read that chloride can be substituted with bromide, though likely in other contexts, such as cleaning. Bromide was used as a sedative in the early 20th century, and bromism was a well-recognized condition at the time, contributing to many psychiatric admissions.
The authors of the article, who are affiliated with the University of Washington, emphasized the potential risks associated with AI-generated health information. They noted, “This case highlights how the use of artificial intelligence can potentially contribute to preventable adverse health outcomes.” However, they were unable to access the patient’s conversation history with ChatGPT to review the specific advice given.
When the authors themselves asked ChatGPT about possible replacements for chloride, the response also included bromide, with no specific health warning and no inquiry into why the question was being asked. They expressed concern that ChatGPT and similar AI applications can generate scientific inaccuracies and propagate misinformation.
In response, OpenAI, the company behind ChatGPT, stated that its latest version, GPT-5, is designed to be better at answering health questions and at flagging potential concerns. Nonetheless, the company reiterated that the chatbot is not a substitute for professional medical advice.
Published shortly before the launch of GPT-5, the article cautioned that while AI could help bridge communication between scientists and the public, it risks delivering out-of-context information: it would be highly unusual, for instance, for a medical professional to recommend sodium bromide as a substitute for table salt.
According to the article, the man eventually sought treatment at a hospital, where he claimed his neighbor might be poisoning him and reported multiple dietary restrictions. Highly paranoid about the water he was offered, he attempted to leave the hospital within a day of his admission. After being placed on an involuntary psychiatric hold, he was treated for psychosis; once stable, he reported other symptoms consistent with bromism, such as excessive thirst and insomnia.
The findings highlight the need for caution when using AI for health-related inquiries.