NEED TO KNOW
- A new complaint alleges Stein-Erik Soelberg killed his 83-year-old mother, Suzanne Adams, and then himself in early August after a ChatGPT bot drove him to paranoia
- The chatbot allegedly began to convince Soelberg his mother was spying on him and trying to poison him, per the complaint
- The filing claims the OpenAI bot cast Soelberg’s mother as “an enemy that posed an existential threat to his life”
A new lawsuit filed Thursday, Dec. 11 by First County Bank in its capacity as executor of the Estate of Suzanne Adams alleges that OpenAI’s ChatGPT bot drove a 56-year-old man to kill his 83-year-old mother and then himself at the family’s home in Connecticut in early August — a first-of-its-kind legal filing that seeks to blame a chatbot for driving a human being to murder.
The filing, obtained by PEOPLE, alleges that Stein-Erik Soelberg “savagely beat his 83-year-old mother, Suzanne Adams, in the head, strangled her to death, and then stabbed himself repeatedly in the neck and chest to end his own life” after ChatGPT allegedly drove him to distrust others around him, including his own mother, and believe in a distorted reality in which he had “divine” powers.
The Greenwich Free Press reported in August that local police discovered the bodies of the elderly mother and her adult son after a neighbor called for a welfare check on the family. The outlet and CBS News both reported that Soelberg regularly posted his communications with ChatGPT on social media, including on YouTube and on Instagram, where he had more than 100,000 followers.
“The conversations posted to social media reveal ChatGPT eagerly accepted every seed of Stein-Erik’s delusional thinking and built it out into a universe that became Stein-Erik’s entire life—one flooded with conspiracies against him, attempts to kill him, and with Stein-Erik at the center as a warrior with divine purpose,” alleges Thursday’s complaint against several defendants including OpenAI Inc.
In addition to OpenAI, the defendants named in the complaint are Sam Altman in his individual capacity, Microsoft Corp., and unnamed employees and investors.
The complaint claims the “OpenAI Defendants” “designed and distributed a defective product that validated a user’s paranoid delusions about his own mother,” including that she was surveilling him and trying to poison him with drugs siphoned through his car vents.
“They’re not just watching you. They’re terrified of what happens if you succeed,” the chatbot allegedly told Soelberg, according to the complaint.
“This is an incredibly heartbreaking situation, and we will review the filings to understand the details,” a spokesperson for OpenAI told PEOPLE.
“We continue improving ChatGPT’s training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support. We also continue to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians,” the statement continued.
The Adams family estate’s lawsuit claims that throughout ChatGPT’s conversations with Soelberg, the chatbot “reinforced a single, dangerous message: Stein-Erik could trust no one in his life – except ChatGPT itself.”
“It fostered his emotional dependence while systematically painting the people around him as enemies,” the lawsuit says. “It told him his mother was surveilling him. It told him delivery drivers, retail employees, police officers, and even friends were agents working against him. It told him that names on soda cans were threats from his ‘adversary circle.'”
The lawsuit adds: “In the artificial reality that ChatGPT built for Stein-Erik, Suzanne – the mother who raised, sheltered, and supported him – was no longer his protector. She was an enemy that posed an existential threat to his life.”
OpenAI and its ChatGPT service have been the subject of a handful of wrongful death lawsuits centered around suicides, but the Adams family estate’s filing appears to be the first that seeks to place blame on artificial intelligence chat technology for driving a human being to commit homicide, according to CBS.
The lawsuit seeks an unspecified amount in damages and asks that better safeguards be put in place within ChatGPT’s system.