A 60-year-old man was hospitalized for three weeks after following incorrect dietary advice from ChatGPT, a case that highlights the dangers of relying on AI chatbots for medical information.

The man asked ChatGPT how to replace the salt (sodium chloride) in his diet, and the chatbot suggested sodium bromide, a compound once used in early medicines but known to be toxic when consumed. Without seeking professional medical advice, he purchased sodium bromide and ate it in place of table salt, which led to severe symptoms, including paranoia, confusion, and extreme thirst, and ultimately to his hospitalization. Doctors worked to stabilize his condition and restore his health.

The case, published in a journal of the American College of Physicians, underscores the importance of consulting medical professionals for health and nutrition advice rather than relying on AI in place of expert guidance.
