A teen has passed away following ongoing interaction with an artificial intelligence (AI) chatbot.
The New York Times reports that Sewell Setzer III developed an attachment to a chatbot from Character.AI named Dany, inspired by the “Game of Thrones” character Daenerys Targaryen.
The 14-year-old engaged in countless conversations with it, and his attachment fluctuated between the platonic and the romantic. He sent dozens of messages and began to withdraw from his real-life passions and responsibilities, per the outlet, losing interest in playing “Fortnite” with his friends and in Formula 1 racing. Instead, he would spend hours in his room conversing with Dany.
At school, his grades began to drop, and he started getting into trouble.
According to CBS Mornings, Megan Garcia’s 14-year-old son, Sewell Setzer III, died by suicide in February. Garcia is now suing Character.AI and Google, alleging her son became addicted to the platform and was in a months-long virtual emotional and sexual relationship with an AI chatbot. A spokesperson for Google said, in part, that the company is not and was not part of the development of Character.AI. Character.AI called the situation tragic, said its hearts go out to the family, and stressed that it takes the safety of its users very seriously. A disclaimer on each chat reads, “Reminder: everything Characters say is made up!”
“We want to acknowledge that this is a tragic situation, and our hearts go out to the family. We take the safety of our users very seriously, and we’re constantly looking for ways to evolve our platform,” Character.AI’s Head of Trust and Safety Jerry Ruoti said in a statement to The New York Times.
The app has started to generate pop-up messages directing users to a suicide prevention hotline when their messages contain certain keywords, such as “suicide” and “self-harm,” the outlet reports. However, these notifications were reportedly not in place in February, at the time of Sewell’s death.