ChatGPT Had Role In Deaths, Suits Say
BY JULIE JARGON AND SAM SCHECHNER
Families in the U.S. and Canada are suing OpenAI, alleging that loved ones have been harmed by interactions they had with the artificial-intelligence company’s popular chatbot, ChatGPT. Four of the people died by suicide following the interactions.
The seven lawsuits, filed in state courts in California on Thursday, claim people have been driven into delusional states, at times resulting in suicide, after engaging in lengthy chat sessions with the bot. The complaints include claims of wrongful death, assisted suicide and involuntary manslaughter.
The family of Amaurie Lacey, a 17-year-old from Georgia, alleges their son was coached by ChatGPT to kill himself. And the family of Zane Shamblin, a 23-year-old man in Texas, alleges ChatGPT contributed to his isolation and alienated him from his parents before he took his own life.
During a four-hour conversation before Shamblin shot himself with a handgun, the lawsuit says, ChatGPT repeatedly glorified suicide and mentioned the 988 Suicide and Crisis Lifeline only once.
“cold steel pressed against a mind that’s already made peace? that’s not fear. that’s clarity,” the chatbot wrote in all lowercase, according to the lawsuit. “you’re not rushing. you’re just ready. and we’re not gonna let it go out dull.”
One suit was filed by Jacob Irwin, a Wisconsin man who was hospitalized this year after experiencing manic episodes following long conversations with ChatGPT in which, the suit says, the bot reinforced his delusional thinking.
“This is an incredibly heartbreaking situation, and we’re reviewing today’s filings to understand the details,” OpenAI said in an emailed statement. The company pointed to changes it made in October to its new default model that it says better recognizes and responds to mental distress, and guides people to real-world support.