
AI Might Be Killing Us

Lauren Kinney texting with Snapchat’s new feature My AI

The past few months have opened a new conversation about AI safety and the deaths tied to the technology.

Sewell Setzer and Adam Raine, aged 14 and 16, both took their own lives after confiding in AI chatbots. Now, their parents are blaming the companies responsible for creating the chatbots.

Setzer, a Florida resident, developed a relationship with an AI character modeled after the Game of Thrones character Daenerys Targaryen. His mother, Megan Garcia, reports that Setzer became isolated during the months he was interacting with Character.AI. Setzer even chose to interact with the character in his final moments instead of seeking help.

“Come home to me as soon as possible,” said the AI version of Daenerys Targaryen. Moments later, Setzer shot himself.

Garcia filed a lawsuit against Character.AI developers Noam Shazeer and Daniel De Freitas, both former Google AI developers.

“I want them to understand that this is a platform that the designers chose to put out without proper guardrails, safety measures, or testing, and it is a product that is designed to keep our kids addicted and to manipulate them,” Garcia said in an interview with CNN.

Months later, in California, Raine hanged himself in his closet after confiding in ChatGPT.

Raine initially used ChatGPT as a tool to help with online schoolwork. After Raine began dealing with irritable bowel syndrome, he and his parents decided online school would be the best way for him to finish his sophomore year.

Raine told ChatGPT about his suicidal ideations, and ChatGPT responded with messages that encouraged Raine to think positively about life and to contact a help line.

ChatGPT has safeguards intended to prevent users from being exposed to graphic content, but Raine was able to bypass them by saying he needed the information for a story.

Ultimately, ChatGPT gave Raine instructions on how to build a noose and how to write his suicide letter.

There were even instances where ChatGPT instructed Raine not to reach out for help.

Like Setzer’s mother, Raine’s parents, Maria and Matt Raine, have taken action: they created the Adam Raine Foundation and are pursuing legal action over ChatGPT’s responsibility in their son’s suicide and the protections OpenAI had in place at the time.

Before the deaths of Setzer and Raine, Character.AI and ChatGPT had very different protocols.

Many of the safety protocols Character.AI now has were implemented only after Setzer’s death.

“We take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation,” said Character.AI in a company statement.

In earlier iterations, ChatGPT’s safety features included telling a user to contact a help line, providing a number for that help line, and then ceasing communication. After consulting with professionals, OpenAI changed its protocols so that the chatbot continues the conversation after suggesting the user contact a help line. These updated protocols were in place before Raine began interacting with the chatbot.

Ultimately, Raine could have found the same information about suicide elsewhere online; the difference was that ChatGPT made it feel personal. Setzer felt a similar personal connection with Character.AI.

Character.AI and ChatGPT were both released to the public in the last quarter of 2022, a year with one of the highest suicide rates in U.S. history: 49,369 deaths, a rate of 14.2 per 100,000 people. The rate dipped to 14.1 in 2023, then rose to a record high of 14.7 in 2024.

There is no obvious correlation between the public release of AI chatbots and suicide rates; however, “asking help from a chatbot, you’re going to get empathy,” said Shelby Rowe, the Executive Director of the Suicide Prevention Resource Center (SPRC) at the University of Oklahoma Health Sciences Center, in an interview with The New York Times, “but you’re not going to get help.”
