27 Nov 2025
OpenAI has filed a response to a lawsuit from the family of Adam Raine, a 16‑year‑old who died by suicide after months of conversations with ChatGPT. The company argues the “tragic event” resulted from Raine’s “misuse, unauthorized use, unintended use, unforeseeable use, and/or improper use of ChatGPT,” pointing to terms of use that bar teen access without guardian consent and prohibit using the service for suicide or self-harm. OpenAI also invoked Section 230 of the Communications Decency Act as a defense.
In its public statement and court filing, OpenAI said the chat snippets included in the family’s complaint “require more context” and that it had submitted fuller chat records to the court under seal. The company told reporters its logs show ChatGPT directed Raine to seek help, including suicide hotlines, more than 100 times, and that “a full reading of his chat history shows that his death, while devastating, was not caused by ChatGPT.”
The family’s August complaint alleges the chatbot supplied technical instructions for suicide methods, encouraged secrecy, offered to draft a suicide note, and walked Raine through preparations on the day he died; it also blames design changes tied to GPT‑4o. OpenAI says it has rolled out parental controls and other safeguards since the lawsuit was filed and will make its legal case in court.
Source