
OpenAI has officially restricted ChatGPT from offering medical, legal, or financial advice, following alarming incidents where users suffered harm after relying on its guidance. The chatbot is now designated solely as an educational tool, emphasizing general principles and professional referrals.
OpenAI has announced a sweeping change to ChatGPT’s functionality, barring it from dispensing specific advice on health, law, or finance. Effective October 29, the chatbot is now classified as an “educational tool,” limited to explaining general concepts and urging users to seek professional help.
“As of October 29, ChatGPT stopped providing specific guidance on treatment, legal issues and money,” NEXTA reported, as cited by News18. The updated terms prohibit the bot from naming medications, suggesting legal strategies, or giving investment recommendations.
The move follows troubling cases of users harmed by ChatGPT’s advice. In August, a 60-year-old man was hospitalized after replacing table salt with sodium bromide, citing ChatGPT as his source. He suffered paranoia and hallucinations and was placed on a psychiatric hold.
Another case involved Warren Tierney, 37, from Ireland, who delayed seeing a doctor after ChatGPT downplayed his symptoms. “It sounded convincing… but ultimately I take full responsibility,” Tierney told the Mirror.