
ChatGPT can reach out to a friend if you're at risk of self-harm

Engadget · May 8, 2026, 12:17 PM · Also reported by 1 other source

Key takeaways

  • OpenAI has introduced Trusted Contact for ChatGPT, which will allow users to nominate a friend whom the company can contact if they're at risk of harming themselves.
  • Last year, OpenAI faced a wrongful death lawsuit accusing the company of enabling a teenager's suicide.
  • Trusted Contact builds on ChatGPT's parental controls, giving adults 18 and over the option to add the details of someone who could help them if they appear to be on the verge of self-harm.

OpenAI has introduced Trusted Contact for ChatGPT, which will allow users to nominate a friend whom the company can contact if they're at risk of harming themselves. More and more people have been using ChatGPT as a digital therapist, relying on the chatbot for their mental health needs. OpenAI previously told the BBC that more than a million of its 800 million weekly users express suicidal thoughts in their conversations.

Last year, OpenAI faced a wrongful death lawsuit accusing the company of enabling a teenager's suicide. The lawsuit alleged that the teenager told ChatGPT about four previous attempts to end his life, and that the chatbot then helped him plan his actual suicide. A BBC investigation published in November 2025 found that in at least one instance, ChatGPT advised a user on how to kill herself. OpenAI told the news organization that it has since improved how its chatbot responds to people in distress.

Trusted Contact builds on ChatGPT's parental controls, giving adults 18 and over the option to add the details of someone who could help them if they appear to be on the verge of self-harm. Users will be able to nominate one adult as their Trusted Contact in ChatGPT's settings, and that person must accept the invitation within one week. If they fail to accept it, the user can choose to add another contact instead. ChatGPT's system will first warn the user that the company may notify their contact if it detects a serious possibility of them hurting themselves. It will encourage the user to reach out to their friend and will even suggest potential conversation starters.

