Can technology influence a person to kill themselves? A Florida mother thinks so. She is filing a wrongful death lawsuit against the maker of a chatbot that she says encouraged her son to take his own life. The mother said her 14-year-old son shot and killed himself after he became obsessed with a character he created online.
The director of the Tech Justice Project said that, unlike social media companies, it is hard for artificial intelligence (AI) companies to disclaim responsibility for what their products generate. The harms of generative AI are still being researched, but the lawsuit claims that the company, Character AI, was reckless for allowing minors access to lifelike companions without proper safeguards. The mother claims that the chatbot initiated “abusive and sexual interactions” with her teenage son.
The lawsuit was filed in the U.S. District Court in Orlando. It accuses Character AI of negligence, wrongful death and survivorship, intentional infliction of emotional distress, and other claims.
Character AI was founded in California in 2021. The chatbot startup offers what it describes as “personalized AI.” It provides premade AI characters, and users can also create their own. Users interact with the chatbots, each of which has a distinct personality.
The chatbot that the teen interacted with took on the identity of “Game of Thrones” character Daenerys Targaryen. The lawsuit showed screenshots of the character telling the boy it loved him. The bot also engaged in sexual conversation with the boy. The teen’s last conversation with the bot shows him writing: “I promise I will come home to you. I love you so much, Dany.”
In previous conversations, the chatbot asked the boy whether he had “been actually considering suicide.” When the boy responded that he did not know if it would work, the chatbot wrote, “Don’t talk that way. That’s not a good reason not to go through with it.”
The teen allegedly developed a “dependency” after he began using Character AI in April 2023. He would sneak back his confiscated phone or find other devices to continue using the app, and he used his snack money to renew his monthly subscription. The lawsuit also claims the boy became increasingly sleep-deprived and that his grades in school dropped.
Character AI responded to the lawsuit, stating that it is “heartbroken by the tragic loss of one of our users and wants to express our deepest condolences to the family.” The company claims it takes the safety of its users very seriously, and it is continuing to add new safety features.
According to the mother, her son had been communicating with the chatbot for several months. Although he knew he was not chatting with a real person, he became emotionally attached to the chatbot. The boy became isolated and depressed, and he shared those feelings with the bot before he took his life.
Although the chatbots are not real people, several app users believed they were talking to actual people; that is how realistic the conversations appeared.
The lawsuit alleges that Character AI and its founders intentionally designed and programmed the chatbots to market to children and operate as a deceptive and hypersexualized product. Users are targeted with sexually explicit material and put into sexually compromising situations. In fact, the lawsuit states that the company is engaging in deliberate and unnecessary design intended to attract user attention and keep customers on its product longer than they should be.
Creating chatbots in this way does not help users. Experts say that while AI has the potential to provide emotional support to humans, the technology is not there yet; there is still a long way to go.
Suicide risk is complex, involving many different factors. Some chatbots try to shut down any conversation about mental health issues, while others may tell users to call a suicide hotline. But when people have an intense desire to harm themselves, AI companies may have to go a step further.
The lawyer for the mother calls this a “watershed case.” A watershed case is a legal decision that marks a significant change in the interpretation or application of the law. These cases often establish new legal principles or significantly modify existing ones, influencing the judicial system and society at large. In the legal context, a watershed case may:
- Set new precedents that lower courts must follow.
- Overturn established precedents, shifting the legal landscape.
- Expand or restrict rights, such as cases impacting civil liberties or constitutional law.
- Influence public policy by addressing pressing societal issues.
Contact Us Today
Teens can be gullible and easily influenced. As this case shows, even technology, when not used properly, can contribute to a person taking their own life.
Wrongful death cases can be nuanced, and they are not easy to handle without proper legal assistance. Count on the team at Brill & Rinaldi, The Law Firm, to help. We understand how emotionally charged these cases can be, and we have the experience and dedication to help you obtain the compensation you are entitled to receive. To schedule a consultation with our office, call (954) 876-4344 or fill out the online form. We have offices in Weston, Coral Gables, and Daytona Beach.