
ChatGPT Salt Swap Advice Leads To Rare Poisoning: What Went Wrong?

A US medical journal highlights a case where ChatGPT’s dietary advice led to bromide toxicity. Experts warn that AI health information can be misleading and stress the need to consult professionals.

Published By: Shairin Panwar
Last Updated: August 13, 2025 02:01:45 IST

A US medical journal has published a warning against using ChatGPT for medical advice after a 60-year-old man developed an unusual and potentially harmful condition. The patient developed bromism, a form of bromide toxicity, after replacing table salt with sodium bromide on advice he allegedly obtained from the chatbot, the Annals of Internal Medicine reports.

Bromism was widespread in the early 20th century, accounting for almost one in every ten psychiatric hospital admissions, but it is rarely seen in contemporary medicine. Sodium bromide, once used as a sedative, is no longer approved for use in foods.

The patient told physicians that he had eliminated sodium chloride from his diet after reading an article detailing its adverse effects, and then sought advice from ChatGPT. The AI allegedly suggested bromide as an alternative without issuing a specific health warning or probing his intent, something the journal’s authors say a qualified medical practitioner would have done.

AI Health Risks Under the Spotlight

Researchers at the University of Washington in Seattle, who reported the case, explained that they could not access the man’s original ChatGPT conversation, but confirmed that when they asked the same question, the AI mentioned bromide without directly warning about its risks.

They cautioned that chatbots are capable of “producing scientific inaccuracies, neglecting critical discussion of results, and inadvertently stoking the dissemination of misinformation.” The case, they claim, shows how AI conversations can lead to unnecessary medical crises.

The patient had fallen ill over a period of three months and had become convinced that his neighbour was poisoning him. He arrived at hospital with paranoia, facial acne, insomnia, and excessive thirst, all symptoms associated with bromide toxicity. After attempting to flee, he was admitted to psychiatric care and treated for psychosis. Once stabilised, his symptoms confirmed the unusual diagnosis.

OpenAI Responds to Health Concerns

OpenAI, the company behind ChatGPT, has just rolled out its GPT-5 model, stating that it is better at responding to health-related questions and more proactive at flagging possible risks. The firm emphasizes, however, that ChatGPT is not a replacement for medical professionals and should not be used to diagnose or treat illness.

