A 60-year-old man was hospitalized for three weeks after following incorrect advice from ChatGPT, a case that highlights the potential dangers of relying on AI for medical information. The man asked ChatGPT how to replace salt in his diet, and the chatbot suggested sodium bromide, a compound now known to be toxic. Without seeking professional medical advice, he purchased sodium bromide and consumed it, which led to severe symptoms, including fear, confusion, and extreme thirst, and ultimately to his hospitalization, where doctors worked to stabilize his condition and restore his health. The case, published in a journal of the American College of Physicians, underscores the importance of consulting medical professionals for health and nutrition advice rather than relying on AI in place of expert guidance.
