Man in US suffers life-threatening poisoning after following ChatGPT diet advice

New Delhi: A man in the United States developed life-threatening bromide poisoning after following diet advice from ChatGPT, in what doctors believe could be the first known case of AI-linked bromism, Gizmodo reported.

The case, documented by University of Washington physicians in the journal Annals of Internal Medicine: Clinical Cases, revealed that the man consumed sodium bromide for three months, believing it to be a safe substitute for the chloride in table salt. The suggestion reportedly came from ChatGPT, which did not caution him about the dangers.

Bromide compounds, once used in medicines for anxiety and insomnia, were banned decades ago due to severe health risks. Today, they are mostly found in veterinary drugs and industrial products, with human cases of bromism being extremely rare.

The man first sought medical help believing his neighbor was poisoning him. Although his vital signs were largely normal, he displayed paranoia, refused water despite being thirsty, and experienced hallucinations. His condition escalated into a psychotic episode, leading to his involuntary psychiatric admission.

He improved after receiving intravenous fluids and antipsychotic medication. Once stable, he told doctors that ChatGPT had suggested bromide as an alternative to table salt.

Although the original chat records were unavailable, doctors later posed the same question to ChatGPT and found that it again suggested bromide without warning about its toxicity.

Experts say the case highlights the risks of relying on AI for health advice, as it can provide information without adequate context or safety warnings. The man fully recovered after three weeks in hospital and was reported to be in good health during a follow-up visit.

TAGS: ChatGPT, poisoning, diet advice