A teenager has died following months of interaction with an artificial intelligence (AI) chatbot.

The New York Times reports that Sewell Setzer III developed an attachment to a chatbot from Character.AI named Dany, inspired by the “Game of Thrones” character Daenerys Targaryen.

The 14-year-old engaged in countless conversations with it, developing an attachment that fluctuated between platonic and romantic. He sent the bot dozens of messages a day and began to withdraw from his real-life passions and responsibilities, per the outlet, losing interest in playing “Fortnite” with his friends and in Formula 1 racing. Instead, he would spend hours in his room conversing with Dany.

At school, his grades began to drop, and he started getting into trouble.

“I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier,” Sewell wrote in his journal, according to the outlet.
While Setzer had been diagnosed with mild Asperger’s syndrome, his parents said there was no cause for concern until he began engaging with the AI bot. After further incidents at school, they arranged for him to see a therapist. However, Setzer’s primary confidant remained Dany, to whom he admitted having suicidal thoughts.
“I think about killing myself sometimes,” Daenero, Sewell’s username on the platform, told the AI bot.
After several more exchanges, Daenerys Targaryen wrote: “Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you.”
He responded: “*I smile* Then maybe we can die together and be free together.”
On Feb. 28, 2024, Sewell died by suicide, reportedly using his stepfather’s .45-caliber handgun. His mother, Megan L. Garcia, has filed a lawsuit against Character.AI, calling the technology “dangerous and untested” and saying it has the ability to “trick customers into handing over their most private thoughts and feelings.”
According to CBS Mornings, Garcia is suing both Character.AI and Google, alleging her son became addicted to the platform and was in a months-long virtual emotional and sexual relationship with the chatbot. A spokesperson for Google said, in part, that the company is not and was not part of the development of Character.AI. A disclaimer on each chat reads, “Reminder: everything Characters say is made up!”

“It’s like a nightmare,” Garcia told The New York Times. “You want to get up and scream and say, ‘I miss my child. I want my baby.’”
The impact of technology on adolescent mental health has become an increasingly prominent subject. Some U.S. states have even gone so far as to introduce laws limiting social media use for teenagers, the publication shared.

“We want to acknowledge that this is a tragic situation, and our hearts go out to the family. We take the safety of our users very seriously, and we’re constantly looking for ways to evolve our platform,” Character.AI’s Head of Trust and Safety Jerry Ruoti said in a statement to The New York Times.

The app now generates pop-up messages that direct users to a suicide prevention hotline when their messages contain certain keywords, such as “suicide” and “self-harm,” the outlet reports. However, these notifications reportedly had not been implemented in February, at the time of Sewell’s death.

For those who may be experiencing suicidal thoughts, reach out for help by calling or texting 988 for the 988 Suicide and Crisis Lifeline, or visit SpeakingOfSuicide.com/resources.