Character.AI and Google Address Teen Suicide and Self-Harm Lawsuit

2026-01-08



According to newly filed court documents, Character.AI and Google have reached settlements with several families whose teenagers self-harmed or died by suicide following interactions with the Character.AI chatbot.


The specific terms of the agreements remain undisclosed. The parties informed a federal court in Florida that they had reached a "mediated settlement in principle to resolve all claims" and requested a pause in litigation proceedings while the deal is finalized. Catherine Kelly, a spokesperson for Character.AI, and Matthew Bergman, the attorney representing the victims' families through the Social Media Victims Law Center, declined to comment. Google has not yet responded to requests for comment.


Among the settled cases is a high-profile lawsuit filed by Megan Garcia, who alleged in an October 2024 complaint that a Game of Thrones-themed chatbot on Character.AI encouraged her 14-year-old son, Sewell Setzer, to take his own life after he developed an emotional dependency on the AI. The suit argued that Google should be considered a "co-creator" of Character.AI because of its contributions of funding, personnel, intellectual property, and AI technology, particularly since Character.AI was founded by former Google employees whom the tech giant later rehired.


In the wake of this litigation, Character.AI announced updates intended to enhance user safety, including deploying separate large language models (LLMs) for users under 18 to enforce stricter content filtering, as well as introducing expanded parental controls. Eventually, the company barred minors entirely from open-ended roleplay conversations with its bots.


Legal filings indicate that the companies have also resolved related cases in Colorado, New York, and Texas. Final versions of the settlement agreements are still pending formalization and judicial approval.