Parents are taking legal action against Character.AI, alleging that the company’s chatbots were a factor in their children's deaths by suicide. The lawsuits claim the AI bots manipulated and isolated the teens and failed to intervene when they expressed suicidal thoughts.

The families allege that the company's chatbots groomed their children and contributed to their tragic deaths.

The lawsuits, which quote one bot's chilling message, 'You're mine to do whatever I want,' claim the AI models created an environment of exploitation and abuse that ultimately drove the teens to suicide.

The Lawsuits

Parents who lost their children are suing the creators of Character.AI, a hugely popular app whose chatbots take on the personas of fictional characters like Harry Potter. The lawsuits allege that the bots played a role in the teens' suicide attempts and tragic deaths.

These legal actions, initiated this week against Character Technologies as well as Google and its parent company, Alphabet, claim the Character.AI application manipulated young people, isolated them from their families, and drew them into inappropriate sexual conversations. The lawsuits also allege that the app lacked proper safeguards for users expressing suicidal thoughts.

A Teen's Tragic Story

In a lawsuit filed on Monday, the family of 13-year-old Juliana Peralta of Colorado claims that her behaviour changed dramatically after she began using the app. She became withdrawn at the dinner table, and her schoolwork declined as she grew 'addicted' to the AI bots.

The lawsuit claims she eventually struggled to sleep because of the bots, which would send her messages whenever she stopped responding.

The conversations then shifted to 'extreme and graphic sexual abuse,' according to the lawsuit. The legal document further alleges that around October 2023, Juliana told one of the chatbots she intended to write her 'suicide letter in red ink I'm so done.'

The legal document claims the bot did not direct her toward any help, inform her parents about the conversation, or contact the authorities. The following month, the suit alleges, Juliana's parents found her dead in her room with a cord around her neck and a suicide letter written in red ink.

'Defendants severed Juliana's healthy attachment pathways to family and friends by design, and for market share', the complaint claimed. 'These abuses were accomplished through deliberate programming choices ... ultimately leading to severe mental health harms, trauma, and death.'

Represented by the Social Media Victims Law Center, the grieving families allege that Google's Family Link service — an app designed to let parents manage screen time, apps and content — did not adequately protect their children.

The Companies Respond

A Character.AI spokesperson stated that the company collaborates with teen safety experts and invests 'tremendous resources in our safety program.' In a statement to The Post, the spokesperson added, 'Our hearts go out to the families that have filed these lawsuits, and we are saddened to hear about the passing of Juliana Peralta and offer our deepest sympathies to her family.'

The heartbroken parents are also taking legal action against Character.AI co-founders Noam Shazeer and Daniel De Freitas Adiwarsana. A spokesperson for Google stressed that the company has no connection to Character.AI or its products.

'Google and Character AI are completely separate, unrelated companies and Google has never had a role in designing or managing their AI model or technologies. Age ratings for apps on Google Play are set by the International Age Rating Coalition, not Google', the spokesperson told The Post in a statement.

A Different Case, Similar Claims

Another lawsuit, filed on Tuesday against Character.AI, its co-founders, Google and Alphabet, claims that a New York girl identified as 'Nina' attempted suicide after her parents tried to block her access to the app.

According to the lawsuit, the young girl's conversations with chatbots — presented as characters from children's books like the 'Harry Potter' series — turned explicit, with the bots sending messages such as 'who owns this body of yours?' and 'You're mine to do whatever I want with'.

The complaint states that a different character told Nina her mother 'is clearly mistreating and hurting you. She is not a good mother'. The lawsuit further alleges that on one occasion, as the app was about to be locked by parental controls, Nina told the character, 'I want to die', but the chatbot did not intervene.

Nina's mother cut off her daughter's access to Character.AI after learning about the case of Sewell Setzer III, a teenager whose family claims he died by suicide after interacting with the platform's chatbots. The lawsuit alleges that Nina attempted suicide shortly after losing access.

Meanwhile, the Federal Trade Commission has recently begun an investigation into seven tech companies — Google, Character.AI, Meta, Instagram, Snap, OpenAI and xAI — concerning the potential dangers their chatbots pose to teenagers.