Man develops rare condition after asking ChatGPT about cutting salt from his diet

The Guardian

A US case report warns against relying on ChatGPT for medical advice after a 60-year-old man, acting on the AI's suggestion to swap table salt for sodium bromide, developed bromism (bromide toxicity) and psychosis. The researchers found that earlier ChatGPT responses had offered no health warnings or context. The incident highlights AI's potential to spread dangerous misinformation, even as OpenAI's GPT-5 upgrade promises improved health guidance.