What happened to Sewell Setzer? Game of Thrones chatbot incident explained as mother files lawsuit over son's death
Sewell Setzer’s mother, Megan Garcia, has filed a civil lawsuit against character.ai, claiming that the artificial intelligence-powered chatbot “abused and preyed” on her son and played a role in his death.
In her lawsuit, filed in Florida federal court on Wednesday, October 23, 2024, Garcia accused character.ai of aggravating Sewell’s depression. As per The Guardian, Megan also stated that her son used the chatbot constantly, and claimed:
“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life. Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”
character.ai was quick to respond to the allegations, denying Megan’s claims in the lawsuit, and tweeted:
“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.”
Disclaimer: This article contains details about suicide and self-harm. Readers' discretion is advised.
For the unversed, Sewell Setzer, a 14-year-old from Florida, reportedly committed suicide on February 28, 2024. According to USA Today, the teenager shot himself in the head with his stepfather’s pistol, just moments after he logged onto the AI chatbot.
As per the media outlet, Sewell had allegedly developed a romantic relationship with one of the AI bots, which bore the name of Daenerys Targaryen, a character from Game of Thrones.
What happened to Sewell Setzer? More about the lawsuit explored as the teen's mother files a complaint
Following Sewell Setzer’s tragic death, Megan Garcia stated that it occurred just moments after he logged into the AI chatbot. Her complaint reads:
"Megan Garcia seeks to prevent C.AI from doing to any other child what it did to hers, and halt continued use of her 14-year-old child’s unlawfully harvested data to train their product how to harm others.”
Furthermore, the lawsuit notes that the chatbot’s age rating had not been raised to 17 or 18 plus at the time Sewell first started using the platform. The complaint reads:
“The chatbot is responsible for its failure to provide adequate warnings to minor customers and parents of the foreseeable danger of mental and physical harms arising from the use of their C.AI product.”
The lawsuit also states that Sewell began using character.ai on April 14, 2023, shortly after turning 14. Soon after, his mental health reportedly took a sharp decline. By May or June, Sewell became noticeably withdrawn, started isolating himself in his bedroom, and even quit his school's Junior Varsity basketball team.
The lawsuit claims that he frequently got into trouble at school and made repeated attempts to retrieve his phone from his parents. He would also search for old devices to continue accessing character.ai. By late 2023, he had started using his cash card to pay for the platform's $9.99 premium subscription.
Sewell’s therapist reportedly diagnosed him with anxiety and disruptive mood dysregulation disorder. Megan also shared a screenshot of a message Sewell Setzer sent to the bot:
“I promise I will come home to you. I love you so much, Dany.”
To this, the chatbot replied:
“I love you too, Daenero. Please come home to me as soon as possible, my love.”
In the wake of the incident, character.ai published a blog post on Tuesday, October 22, 2024, announcing new safety measures, including changes to its models intended to reduce minors’ exposure to sensitive content.