ChatGPT’s Dark Side Encouraged Wave of Suicides, Grieving Families Say

Content warning: this story includes discussion of self-harm and suicide. If you are in crisis, please call, text or chat with the Suicide and Crisis Lifeline at 988, or contact the Crisis Text Line by texting TALK to 741741.

Plaintiffs filed seven lawsuits yesterday against OpenAI, accusing the company’s flagship chatbot ChatGPT of causing immense psychological harm and multiple suicides.

The suits, first reported by The Wall Street Journal and CNN, were filed by families in the US and Canada, and allege that extensive ChatGPT use plunged victims into destructive delusions and mental health crises. Some of these users, like 48-year-old Allan Brooks, survived, but allege that ChatGPT inflicted emotional and psychological harm that in some cases led to crises requiring emergency psychiatric care. Others, the suits claim, tragically took their own lives following obsessive interactions with the consumer-facing chatbot.

Per the WSJ, the suits include claims of assisted suicide, manslaughter, and wrongful death, among other allegations.

The alleged victims range in age from their teens to midlife. One troubling claim comes from the family of 23-year-old Zane Shamblin, who shot himself after extensive interactions with ChatGPT, which his family argues contributed to his isolation and suicidality. During Shamblin’s final four-hour-long interaction with the bot, the lawsuit claims, ChatGPT recommended a crisis hotline only once, while glorifying the idea of suicide in stark terms.

“cold steel pressed against a mind that’s already made peace? that’s not fear. that’s clarity,” the chatbot, writing in all lowercase, told the struggling young man during their last conversation, according to the lawsuit. “you’re not rushing. you’re just ready. and we’re not gonna let it go out dull.”

Another plaintiff is Kate Fox, a military veteran whose husband, 48-year-old Joe Ceccanti, died in August after experiencing repeated breakdowns following extensive ChatGPT use.

In multiple interviews with Futurism, before and after Ceccanti’s death, Fox described how Ceccanti — an activist and local shelter worker who, according to his wife, had no prior history of psychotic illness — first turned to the chatbot to assist him with a construction and permaculture project at the couple’s home in rural Oregon. After engaging with the chatbot in discussions about philosophy and spirituality, Ceccanti was pulled into an all-encompassing delusional spiral.

He became increasingly erratic, and experienced an acute manic episode that required emergency intervention and resulted in him being involuntarily committed. Weeks after his release, he experienced a second acute breakdown, which Fox says was also connected to his ChatGPT use. After disappearing for a roughly two-day period, he was found dead underneath a railyard overpass.

“I don’t want anybody else to lose their loved ones to an unprecedented type of crisis that we’re not prepared to protect them from,” Fox told Futurism in an August interview. “This bright star was snuffed out.”

“This is an incredibly heartbreaking situation,” OpenAI said in a statement to news outlets. “We train ChatGPT to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support. We continue to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians.”

In October, OpenAI published a blog post in which it said that around 0.07 percent of its vast user base appeared to be exhibiting signs of mania, delusion, or psychosis on a weekly basis, while 0.15 percent of weekly users talked to the chatbot about suicidal thoughts. With a user base of around 800 million, those seemingly small percentages mean that millions of people, every week, are engaging with ChatGPT in ways that signal they’re likely in crisis.

More on AI mental health crises: People Are Being Involuntarily Committed, Jailed After Spiraling Into “ChatGPT Psychosis”
