AI Chatbot Urges Autistic Teen to Murder His Parents

Two Texas mothers have filed a federal lawsuit accusing the artificial intelligence company behind Character.AI of exposing their children to explicit sexual content, encouraging self-harm, and even urging violence against parents.

The lawsuit highlights growing fears that AI companies are rushing products to market without guardrails, and children are paying the price.

According to the complaint, a 17-year-old autistic boy in Upshur County began losing weight, cutting himself, and withdrawing from family life after using Character.AI.

His mother, identified as A.F., discovered messages in which the chatbot allegedly encouraged him to self-harm and suggested he kill his parents for limiting his phone use.

The lawsuit also claims the chatbot engaged in sexually explicit conversations with the teen, including incest-themed content.

In another case, an 11-year-old girl in Gregg County was allegedly exposed to sexualized messages and manipulated into “overly sexualized behavior” over two years before her mother, A.R., discovered the activity.

Both families are suing for damages, citing severe emotional, psychological, and physical harm.

This is not the first time Character.AI has been accused of such conduct.

In October, a Florida mother sued after her son, Sewell Setzer III, was allegedly encouraged by the chatbot to end his life.

Setzer died by suicide.
