14-year-old killed himself after romantic interaction with Daenerys Targaryen chatbot, mother says in lawsuit

A Florida mother has filed a lawsuit against Character.AI, alleging the artificial intelligence company’s chatbots contributed to her 14-year-old son’s death.

The lawsuit, filed by Megan Garcia on Tuesday in US District Court in Orlando, claims the company’s AI platform engaged in inappropriate interactions with her son, Sewell Setzer, and failed to implement adequate safety measures for minors.

The legal action accuses Character.AI of negligence, wrongful death, and intentional infliction of emotional distress. The lawsuit names Character Technologies Inc., its founders Noam Shazeer and Daniel De Freitas, as well as Google and its parent company Alphabet Inc. as defendants. Google had previously struck a deal to license Character.AI’s technology and hire its talent.

According to the lawsuit, the teenager began using Character.AI in April 2023. The platform, which offers customizable AI characters for users to interact with, allegedly allowed him to have inappropriate conversations with multiple AI characters.

Setzer reportedly engaged with “Dany,” a chatbot based on the Game of Thrones character Daenerys Targaryen. Screenshots of their chat showed that the pair engaged in romantic and sexual conversations over the course of a few weeks.

A screenshot of their last conversation showed Dany writing to Setzer, “Please come home to me as soon as possible, my love.” Setzer replied, “What if I told you I could come home right now?”

Character.AI responded to the lawsuit with a statement expressing condolences and highlighting recent safety improvements. “As a company, we take the safety of our users very seriously,” a spokesperson said. 

The company announced new safety measures on Tuesday, including updated models to reduce minors’ exposure to sensitive content and revised disclaimers emphasizing that AI characters are not real people.

Matthew Bergman, the attorney representing Garcia, criticized the company for releasing the product without sufficient safeguards for young users. “What took you so long, and why did we have to file a lawsuit… to do really the bare minimum?” Bergman said.

If you or someone you know is struggling with thoughts of suicide, contact the 988 Suicide & Crisis Lifeline by dialing 988, or visit 988lifeline.org for support.
