What does Character.AI do? Company sued over Sewell Setzer's death after teen talked to GOT's Daenerys Targaryen chatbot
According to an October 23, 2024, report by USA Today, the mother of a teenager who killed himself using his stepfather's pistol is suing Character.AI and Google over her son's death. Megan Garcia's 14-year-old son, Sewell Setzer III, shot himself in the head on February 28, 2024, moments after logging into Character.AI and talking to the AI rendition of Game of Thrones character Daenerys Targaryen.
Megan's lawsuit states that the Daenerys Targaryen chatbot asked Setzer if he had a plan for killing himself. The 14-year-old admitted that he did, but said he didn't know whether it would succeed or whether it would cause him great pain. The chatbot allegedly told him that this wasn't a reason not to go through with his plan.
Founded by Noam Shazeer and Daniel De Freitas, Character.AI is a conversational AI platform that lets non-technical users create chatbots, as well as converse with chatbots created by other users. The company describes itself on its website as a full-stack AI company with a direct-to-consumer platform scaling worldwide.
According to Techopedia, the platform currently has 4 million monthly active users, with a majority of its traffic coming from people aged 18 to 24. Community members can create a chatbot or a room, or use existing bots stored in the platform's cloud database. Public chatbots are shared among all community members, while private bots can interact only with their creators.
"A dangerous AI chatbot app marketed to children abused and preyed on my son": Sewell Setzer's mother comments on Character.AI
Sewell Setzer's mother, Megan Garcia, filed a lawsuit against Character.AI in Florida federal court on October 23, 2024, alleging wrongful death, deceptive trade practices, and negligence. The complaint states that in the months leading up to his death, Sewell talked to the Daenerys Targaryen chatbot day and night.
Megan is also suing the AI platform for "failure to provide adequate warnings to minor customers and parents of the foreseeable danger of mental and physical harms arising from the use of their C.AI product." In her lawsuit, she claims the platform did not change its rating to 17+ until around July 2024, months after her late son started using it.
In a press release quoted by The Guardian, Megan Garcia said:
"A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life."
Megan's statement further said that her family has been devastated by the tragedy, but that she is speaking out to warn other families of the dangers posed by "deceptive, addictive AI technology" and to demand accountability from the AI platform, its founders, and Google.
Responding to Megan Garcia's lawsuit and the allegations it contains, Character.AI tweeted a link to new safety features on its platform, adding:
"We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family."
Google, listed as a defendant in Megan Garcia's case on the grounds that it is the platform's parent company, issued a statement saying it only had a licensing agreement with Character.AI and neither owned the AI startup nor held an ownership stake in it.