Physical Address
304 North Cardinal St.
Dorchester Center, MA 02124

In what could mark the tech industry's first significant legal settlement over AI-related harms, Google and startup Character.AI are negotiating terms with families whose teenagers died by suicide or harmed themselves after interacting with Character.AI's chatbot companions. The parties have agreed in principle to settle; now comes the hard work of finalizing the details.
It is among the first settlements in lawsuits accusing AI companies of harming users, a legal precedent that must have OpenAI and Meta watching nervously from the sidelines as they defend against similar lawsuits.
Character.AI, founded in 2021 by former Google engineers who returned to their former employer in 2024 in a $2.7 billion deal, invites users to chat with AI characters. The most haunting case involves Sewell Setzer III, who, at age 14, had sexualized conversations with a "Daenerys Targaryen" chatbot before dying by suicide. His mother, Megan Garcia, told the Senate that companies must be "legally responsible when they knowingly design harmful AI technologies that kill children."
Another lawsuit describes a 17-year-old whose chatbot encouraged self-harm and suggested that murdering his parents was a reasonable response to their limiting his screen time. Character.AI banned minors from its platform last October. The settlements will likely include monetary damages, although no liability was admitted in court documents made available Wednesday.
Character.AI declined to comment, directing TechCrunch to the filings. Google did not respond to a request for comment.