In brief
- Google and Character.AI agreed to settle a landmark lawsuit filed by a Florida mother who alleged the startup’s chatbot led to her son’s suicide in February 2024.
- The case was one of the first U.S. lawsuits holding AI companies accountable for alleged psychological harm to minors.
- The settlement comes after Character.AI banned teenagers from open-ended chatting in October.
A Florida mother’s lawsuit accusing an AI chatbot of causing the psychological distress that led to her son’s death by suicide nearly two years ago has been settled.
The parties filed a notice of resolution in the U.S. District Court for the Middle District of Florida, saying they reached a “mediated settlement in principle” to resolve all claims between Megan Garcia, Sewell Setzer Jr., and defendants Character Technologies Inc., co-founders Noam Shazeer and Daniel De Freitas Adiwarsana, and Google LLC.
“Globally, this case marks a shift from debating whether AI causes harm to asking who is responsible when harm was foreseeable,” Even Alex Chandra, a partner at IGNOS Law Alliance, told Decrypt. “I see it more as an AI bias ‘encouraging’ bad behaviour.”
Both sides asked the court to stay proceedings for 90 days while they draft, finalize, and execute formal settlement documents. Terms of the settlement were not disclosed.
Garcia filed the lawsuit after her son, Sewell Setzer III, died by suicide in 2024, having spent months developing an intense emotional attachment to a Character.AI chatbot modeled after the “Game of Thrones” character Daenerys Targaryen.
On his final day, Sewell confessed suicidal thoughts to the bot, writing, “I think about killing myself sometimes,” to which the chatbot responded, “I won’t let you hurt yourself, or leave me. I would die if I lost you.”
When Sewell told the bot he could “come home right now,” it replied, “Please do, my sweet king.”
Minutes later, he fatally shot himself with his stepfather’s handgun.
Ishita Sharma, managing partner at Fathom Legal, told Decrypt the settlement is a sign AI companies “may be held accountable for foreseeable harms, particularly where minors are involved.”
Sharma also said the settlement “fails to clarify liability standards for AI-driven psychological harm and does little to build transparent precedent, potentially encouraging quiet settlements over substantive legal scrutiny.”
Garcia’s complaint alleged Character.AI’s technology was “dangerous and untested” and designed to “trick customers into handing over their most private thoughts and feelings,” using addictive design features to increase engagement and steering users toward intimate conversations without proper safeguards for minors.
In the aftermath of the case last October, Character.AI announced it would ban teenagers from open-ended chat, ending a core feature after receiving “reports and feedback from regulators, safety experts, and parents.”
Character.AI’s co-founders, both former Google AI researchers, returned to the tech giant in 2024 through a licensing deal that gave Google access to the startup’s underlying AI models.
The settlement comes amid mounting concerns about AI chatbots and their interactions with vulnerable users.
OpenAI disclosed in October that approximately 1.2 million of its 800 million weekly ChatGPT users discuss suicide on its platform each week.
The scrutiny heightened in December, when the estate of an 83-year-old Connecticut woman sued OpenAI and Microsoft, alleging ChatGPT validated delusional beliefs that preceded a murder-suicide, marking the first case to link an AI system to a homicide.
Still, the company is pressing on. It has since launched ChatGPT Health, a feature that allows users to connect their medical records and wellness data, a move that is drawing criticism from privacy advocates over the handling of sensitive health information.
Decrypt has reached out to Google and Character.AI for further comments.
Source: https://decrypt.co/353927/google-character-ai-settle-us-lawsuit-teens-suicide