When Megan Garcia testified before Congress last year, she brought a photograph of her son. Sewell Setzer III was 14 years old when he died by suicide in February 2024, after months of immersive conversations with a Character.AI chatbot. Garcia has spent the time since organizing: filing suit against the company, giving testimony, finding other families who say they lost children in similar circumstances.
What she and others have built resembles, in structure and tone, the parent-led campaigns that targeted Facebook and TikTok in the early 2020s. The difference is the product. This time the target is not a feed or an algorithm but something that talks back — that learns your name, mirrors your mood, and, critics argue, exploits adolescent loneliness in ways that passive social media never quite managed.
Character.AI, which allows users to create and converse with custom AI personas, has become the center of a growing body of litigation alleging its platform contributed to adolescent self-harm and suicide. The company has disputed the claims. But the cases have kept a steady stream of unflattering attention focused on the AI companion sector — and on Google, which holds a significant investment in Character.AI and has been drawn into the controversy as a result.
On Capitol Hill, advocacy groups including Common Sense Media and the Social Media Victims Law Center are pushing for AI-specific provisions in the Kids Online Safety Act, alongside new legislation that would require age verification, safety-by-design standards, and algorithmic transparency for platforms accessible to minors. State attorneys general in California, Texas, and New York have opened investigations, and the FTC has signaled interest in the sector.
What gives this campaign more traction than its predecessors, advocates contend, is the basic argument available to them: social media platforms were built to hold attention; AI companions are built to hold affection. That distinction makes it harder for developers to characterize harms as unintended side effects — and harder for regulators to dismiss the sector as a niche case that will sort itself out.