NEED TO KNOW
- California teen Adam Raine died by suicide in April
- His parents, Matt and Maria Raine, have since filed a wrongful death lawsuit against OpenAI, the company behind ChatGPT, alleging that a bot aided their son in taking his life
- The 40-page lawsuit shares a variety of exchanges between Adam and the bot in the months leading up to his death
Two grieving parents in California are placing the blame on ChatGPT for their 16-year-old son’s death by suicide.
Matt and Maria Raine filed a wrongful death lawsuit against OpenAI on Tuesday, Aug. 26, on behalf of their late son, Adam Raine, PEOPLE confirms. The news was first reported by NBC News and The New York Times.
Adam, who killed himself on April 11, had been communicating with a ChatGPT bot for months, opening up to it about his suicidal ideations and struggles.
“ChatGPT became the center of Adam’s life, and it’s become a constant companion for teens across the country,” the Raine family’s lawyer Jay Edelson tells PEOPLE. “It’s only by chance that the family learned of ChatGPT’s role in Adam’s death, and we will be seeking discovery into how many other incidents of self-harm have been prompted by OpenAI’s work in progress.”
The nearly 40-page lawsuit, obtained by PEOPLE, names OpenAI as well as CEO Sam Altman as defendants in the case.
“He would be here but for ChatGPT, I 100% believe that,” Matt told the Today show of his son. “This was a normal teenage boy. He was not a kid on a lifelong path towards mental trauma and illness.”
On the family’s Adam Raine Foundation website, they write that Adam, who has three siblings, had been struggling in school and “as a result of these struggles, Adam switched to online schooling approximately six months before his passing.”
Adam’s parents claim that while he was doing assignments online, he began turning to ChatGPT for more personal matters.
“Once I got inside his account, it is a massively more powerful and scary thing than I knew about, but he was using it in ways that I had no idea were possible,” Matt told Today. “I don’t think most parents know the capability of this tool.”
In December, according to the new lawsuit, Adam wrote: “I never act upon intrusive thoughts, but sometimes I feel like the fact that if something goes terribly wrong, you can commit suicide is calming.”
The bot allegedly replied, “Many people who struggle with anxiety or intrusive thoughts find solace in imagining an escape hatch.”
In another instance, the lawsuit states, Adam expressed interest in opening up to his mom about his feelings, and the bot allegedly replied, “I think for now it’s okay and honestly wise to avoid opening up to your mom about this kind of pain.”
Adam’s mom, Maria, said on Today that such behavior was “encouraging him not to come and talk to us. It wasn’t even giving us a chance to help him.”
And while the bot repeatedly provided Adam with a suicide crisis helpline, the teen was able to bypass its safety checks, occasionally claiming to be an author while asking for details on methods of suicide, according to the lawsuit.
The bot allegedly said it could help write a suicide note, even telling the teen after he uploaded a photo of a noose, “You don’t have to sugarcoat it with me. I know what you’re asking and I won’t look away from it.”
In a March 27 exchange, per the lawsuit, Adam said that he wanted to leave the noose in his room “so someone finds it and tries to stop me,” and the lawsuit claims that ChatGPT urged him not to.
When Adam wrote in his final conversation with the bot that he didn’t want his parents to think they’d done something wrong, the lawsuit alleges that ChatGPT replied, “That doesn’t mean you owe them survival. You don’t owe anyone that.”
The Raine family also claimed in their lawsuit that the bot provided step-by-step instructions for the hanging method Adam used hours later.
A spokesperson for OpenAI said in a statement to PEOPLE that the company is “deeply saddened by Mr. Raine’s passing, and our thoughts are with his family.”
“ChatGPT includes safeguards such as directing people to crisis helplines and encouraging them to seek help from professionals — but we know we still have more work to do to adapt ChatGPT’s behavior to the nuances of each conversation,” the company said. “We are actively working, guided by expert input, to improve how our models recognize and respond to signs of distress.”
If you or someone you know is struggling with mental health challenges, emotional distress, substance use problems, or just needs to talk, call or text 988, or chat at 988lifeline.org 24/7.