USA - An artificial intelligence software company cannot use a free speech defense in a wrongful death lawsuit lodged by the mother of a 14-year-old who died by suicide after developing a crush on a chatbot, a federal judge ruled Wednesday. Last October, Megan Garcia sued Character Technologies, the developer of Character AI, an app that lets users interact with chatbots based on celebrities and fictional people. She claims her son, Sewell Setzer III, became addicted to the app while talking with chatbots based on the “Game of Thrones” characters Daenerys Targaryen and Rhaenyra Targaryen.
In February 2024, after months of interacting with the chatbots, sometimes with sexual undertones, Setzer sent a message to the Daenerys chatbot, expressing his love and saying he would “come home” to her, according to the complaint. After the chatbot replied, “Please do my sweet king,” Setzer shot himself.
Garcia brought claims of wrongful death, negligence, product liability and unfair business practices against the company, its founders and Google, which invested heavily in Character AI, seeking an unspecified amount of money and more safety measures to prevent similar tragedies.
Garcia’s attorney, Matthew Bergman of the Social Media Victims Law Center, called the ruling “precedent setting.” “This is the first time a court has ruled that AI chat is not speech,” he said. “But we still have a long hard road ahead of us.”