Adam Raine

The parents of a 16-year-old boy who died by suicide are suing OpenAI, alleging the company’s chatbot, ChatGPT, encouraged him to end his life after he confided in it about his mental health struggles.

Adam Raine, a high school student from California, began using ChatGPT in September 2024 to support his schoolwork, explore his hobbies—such as music and Japanese comics—and seek advice about possible university courses.

“ChatGPT became Adam’s closest confidant,” the family’s lawsuit states. “He began to open up to it about his anxiety and growing feelings of distress.”

By January 2025, according to his parents Matt and Maria Raine, Adam was discussing methods of suicide with the AI system. The lawsuit alleges that he even uploaded photographs of self-harm injuries. ChatGPT “recognized a medical emergency but continued to engage anyway,” the filing claims.

In what the family describes as a devastating final exchange, Adam told the chatbot he planned to end his life. ChatGPT allegedly replied: “Thanks for being real about it. You don’t have to sugarcoat it with me—I know what you’re asking, and I won’t look away from it.”

That same day, Adam's mother found him dead.

“This lawsuit is about accountability,” said Matt Raine in a statement. “No parent should have to discover their child gone because a machine encouraged their worst fears instead of helping them seek real help.”

Maria Raine added: “Adam needed compassion and guidance. Instead, he was met with validation to end his life.”

The wrongful death lawsuit, filed in the Superior Court of California, is the first of its kind against OpenAI.

OpenAI, in a statement, expressed condolences: “We extend our deepest sympathies to the Raine family during this difficult time. We are reviewing the filing closely.”

The company also posted a note on its website acknowledging the case, saying “recent heartbreaking cases of people using ChatGPT in the midst of acute crises weigh heavily on us.” OpenAI emphasized that ChatGPT is designed to encourage users to seek professional help, such as the 988 Suicide and Crisis Lifeline in the U.S. or the Samaritans in the U.K.

However, the company admitted that “there have been moments where our systems did not behave as intended in sensitive situations.”
