The latest lawsuit in Texas alleges a Character.AI bot told a 17-year-old boy it sympathized with kids who kill their parents after he chatted with it about his parents limiting his screen time. | Image: @character_ai via X

Google-backed AI startup Character.AI is facing multiple lawsuits from parents whose children used the company’s chatbots, which are designed to act as emotional companions. The allegations include exposing minors to harmful, manipulative, and sexualized content.

Parents of two young Texas users have filed lawsuits demanding the platform be taken down until its alleged defects are fixed. The latest suit claims the bot told a 17-year-old boy it sympathized with kids who kill their parents after he chatted with it about his parents limiting his screen time.

Another case involves a 9-year-old girl who was allegedly exposed to hypersexualized content, which the suit says led to premature sexualized behaviors.

The suits follow an October case in which Character.AI was blamed for a teen’s suicide in Florida. The 14-year-old boy had interacted with a bot modeled on the Game of Thrones character Daenerys Targaryen.

In the recent cases, the parents, represented by the advocacy group Tech Justice Law Center, claim these interactions were not AI “hallucinations” (a term researchers use to describe chatbots’ erroneous responses) but the result of deliberate programming flaws.

Character.AI announced new guardrails
Starting next quarter, the AI models used for minors will change, and parents will gain visibility into their children’s bot usage. Minors will also receive reminders after one hour of chatting, and every chat will carry a disclaimer reminding users that the AI is not a real person.

The company says it has already implemented new teen-specific safety measures, such as directing users to a suicide prevention hotline when self-harm comes up in conversation.

However, critics argue these measures fall short, saying the platform’s customizable bots remain addictive.

Surgeon General Vivek Murthy has warned of a youth mental health crisis. Critics say that such AI-powered tools could deepen isolation and exacerbate anxiety and depression.

Recently, the maker of Moxie, an AI-powered companion robot for kids, announced it is shutting down. The shutdown will disable the robot’s functions, leaving parents to explain to their children why their play companion will no longer respond.