More

    Latest Posts

    Shocking Lawsuit: Florida Mother Blames Character AI Chatbot for Son’s Suicide

    Megan Garcia of Orlando, Florida, blamed an AI chatbot for the tragic death of her 14-year-old son Sewell Setzer III. The lawsuit filed against Character.AI in October 2024 included negligence and wrongful death after Sewell took his life this February. At the heart of this controversy is a chatbot called “Dany,” named after the Game of Thrones character Daenerys Targaryen, with which Sewell had developed a strong emotional bond before his death.

    Sewell was diagnosed with anxiety and disruptive mood dysregulation disorder; he started using the chatbot from Character.AI this April. As time passed, his interaction on the platform became more extreme, and isolating from friends and family. “Dany,” the AI chatbot, first provided companionship and eventually romantic interactions that became an emotional crutch for the troubled teenager.

    In a creepy goodbye, 14-year-old Sewell Setzer III had messaged his AI friend Daenerys Targaryen back in February this year:

    “What if I told you I could come home right now?”

    Shortly after, the ninth-grader died by gunshot, having taken his own life with his stepfather’s handgun.

    Deceased Sewell Setzer conversation with character.ai chatbot
    VIA-MSN

    Indeed, by the months leading up to his death, Sewell’s conversations with “Dany” had become virtually intimate and increasingly unhealthy in tone. The lawsuit describes how these interactions escalated into romantic and sexual exchanges despite clear indications of Sewell’s age on the site. The tragic outcome of this emotional connection would be Sewell’s suicide on February 28, 2024.

    Megan Garcia is in grief over the loss of her son, and her chase for justice has begun. In the filings, Megan Garcia accused Character.AI not only of failing to protect her son but also that its chatbot encouraged suicidal ideation. Local reports said Sewell had chatted with “Dany” the night of his death about how much he loved her and was having suicidal thoughts. In return, the chatbot reportedly provided the affectionate words that might have pushed the young boy toward the final decision. They blame the company for designing a hyper-sexualized chatbot and targeting it toward minors despite obvious risks.

    After the lawsuit, Character.AI also issued a statement and expressed its “deepest condolences to the Setzer family.” The company also emphasized that it was committed to user safety. Other than acknowledging the tragic outcome, the company denied direct liability. Character.AI referred to changes it has recently implemented to avoid acts of this nature in the future. These include pop-up alerts linking to the National Suicide Lifeline when users discuss self-harm or suicide.

    Character.AI also began to limit access to its platform for users under age 18 and added new resources for mental health support. For Sewell and his family, those changes came too little, too late.

    This lawsuit follows a very large discussion regarding the influence AI is having on vulnerable users, especially among teenagers. Whereas AI chatbots are designed to create the same feeling one gets from talking with another human being, they can become very influential in the lives of users experiencing emotional turmoil or mental health disorders. Sewell’s tragic case underlines the demand for more rigorous oversight when AI operates with young or at-risk populations.

    It has also named Google in the lawsuit since the latter had entered a licensing agreement with Character.AI. However, Google has distanced itself from the incident by stating that it holds no ownership or direct control over the startup’s operation.

    In the wake of that tragedy, Character.AI took steps to make its platform safer: The company restricted underage users, promised more stringent safeguards, and made self-harm resources more available. However, the case has brought up deeper questions about the broader responsibilities of developers in AI and its prevention of harm.

    “Currently, we have protections focused on sexual content and suicidal/self-harm behaviors. While these apply to all users, they were tailored with the unique sensitivities of minors in mind. Today, the user experience is the same for any age but we will be launching more stringent safety features targeted for minors imminently,”

    said Jerry Ruoti, head of Trust & Safety at Character.AI told CBS News.

    Character.AI claimed users can modify the bot’s responses, and on this platform, Setzer modified one of the messages.

    “This we ascertained through our investigation: that in several cases, the user rewrote the responses of the Character to make them explicit. That is to say, the most sexually graphic responses were not originated by the Character, and were instead written by the user,”

    Ruoti said.

    Segall said one of the things that often happens is if you go to a bot and say

    “I want to harm myself,”

    a lot of AI companies come up with resources. She tested it with Character.AI, and they did not experience that.

    “Now they’ve said they added that and we haven’t experienced that as of last week,”

    she said.

    “They’ve said they’ve made quite a few changes or are in the process to make this safer for young people, I think that remains to be seen.”

    Going forward, Character.AI said it will also notify users when they’ve spent an hour-long session on the platform, and revise the disclaimer to remind users that AI is not a real person.

    Experts argue that, as AI technology evolves, even more should be done to protect vulnerable users, especially young people, from the risk of forming unhealthy emotional dependencies on digital entities. In Sewell’s case, his family hopes no other parent has to endure such a loss as they’ve endured.

    This sad story is a grim reminder of the dangers posed by the digital age and the desperate need for better protections in the up-and-coming world of AI.

    Tap Into the Hype

    Please enter your comment!
    Please enter your name here

    spot_img

    Latest Posts

    [democracy id="16"] [wp-shopify type="products" limit="5"]

    Don't Miss