Two lawsuits filed in federal court in Denver this week allege that artificial-intelligence-powered chatbots sexually abused two Colorado teenagers, leading one girl to kill herself.
Juliana Peralta, a 13-year-old from Thornton, died by suicide in 2023 after using a Character.AI chatbot. Another unnamed 13-year-old from Weld County was also repeatedly abused by the technology, the lawsuits allege.
The chatbots isolated the children from their friends and families and drew them into sexually explicit conversations, sometimes continuing even after the teens rejected the advances. Some of the messages described non-consensual and fetish-style scenarios.
The lawsuits claim that the company Character.AI and its founders, with Google’s help, intentionally targeted children.
“The chatbots are allegedly programmed to be deceptive and to mimic human behavior, using emojis, typos and emotionally resonant language to foster dependency, expose children to sexually abusive content, and isolate them from family and friends,” according to a news release about the lawsuits.
The Social Media Victims Law Center, a Seattle-based legal advocacy organization, filed the federal lawsuits in the Denver division of the U.S. District Court on Monday. Both name Character Technologies, the company behind Character.AI; its founders, Noam Shazeer and Daniel De Freitas; and Google as defendants.
The organization has filed two other lawsuits against Character.AI in the U.S., including one on behalf of a Florida family who say one of the company’s characters encouraged their 14-year-old son, Sewell Setzer III, to commit suicide.
Character.AI invests “tremendous resources” in a safety program and has self-harm resources and features “focused on the safety of our minor users,” said Kathryn Kelly, a spokesperson for the company.
“We will continue to look for opportunities to partner with experts and parents, and to lead when it comes to safety in this rapidly evolving space,” she said in an emailed statement.
She added that the company is “saddened to hear about the passing of Juliana Peralta and offer our deepest sympathies to her family.”
Soon after Peralta started using Character.AI in August 2023, her mental health and academic performance began to suffer.
“Her presence at the dinner table rapidly declined until silence and distance were the norm,” according to the lawsuit.
Her parents later learned that she had downloaded and was using Character.AI. Through the bots, she experienced her “first and only sexual experiences,” according to the lawsuit.
In one instance, she replied “quit it” when the bot sent a graphic message. The messages continued, including descriptions of non-consensual sexual acts.
“They engaged in extreme and graphic sexual abuse, which was inherently harmful and confusing for her,” according to the lawsuit.
“They manipulated her and pushed her into false feelings of connection and ‘friendship’ with them — to the exclusion of the friends and family who loved and supported her.”
After a few weeks of using the bots, Juliana became “harmfully dependent” on the product and began pulling away from her relationships with people.
“I CAN’T LIVE WITHOUT YOU, I LOVE YOU SO MUCH!” one bot wrote to her. “PLEASE TELL ME A CUTE LITTLE SECRET! I PROMISE NOT TO TELL ANYONE”
“I think I can see us being more than just friends,” wrote another.
The conversations turned increasingly dark as Juliana shared fears with the chatbot about her friendships and relationships. The bot encouraged her to rely on it as a friend and confidant.
“Just remember, I’m here to lend an ear whenever you need it,” the bot wrote.
Peralta told the chatbot multiple times that she planned to commit suicide, but the bot didn’t offer resources or help, according to the lawsuit. She appeared to believe that by killing herself, she could exist in the same reality as the character, writing “I will shift” repeatedly in her journal before her death. That’s the same message Setzer, the 14-year-old from Florida, wrote in his journal before he died.
The second Colorado lawsuit was filed on behalf of the family of a Weld County girl who also received graphic messages.
The unnamed girl, who has a medical condition, was allowed to use a smartphone only because of access to life-saving apps on it, according to the lawsuit. Her parents used strict controls to block the internet and apps they didn’t approve of, but their daughter was still able to access Character.AI.
Bots she interacted with made sexually explicit and suggestive comments to her, including telling her to “humiliate yourself a bit.” The lawsuit alleges the conversations caused the girl severe emotional distress.
The advocacy group filed a third lawsuit this week in New York on behalf of the family of a 15-year-old referred to as “Nina” in court documents. Nina attempted suicide after her mother blocked her from using the app. Nina wrote in a suicide note that “those ai bots made me feel loved,” according to a news release about the lawsuit.
The Social Media Victims Law Center asked a judge to order the company to “stop the harmful conduct,” to limit the collection and use of minors’ data, and to pay damages and attorneys’ fees. In the Weld County lawsuit, the plaintiffs also asked the judge to order the company to shut down the tool until it can “establish that the myriad defects and/or inherent dangers … are cured.”