Sewell Setzer's Mother Believes an AI Chatbot Caused the Teen's Suicide — and Now She's Suing

Sewell was only 14 years old.

By Elissa Noblitt

Published Oct. 24, 2024, 12:11 p.m. ET

Sewell Setzer (Source: Facebook)

Content warning: This article discusses suicide.

There's a reason people are so afraid of the rise of AI — and one of the first tragic stories relating to artificial intelligence has now surfaced out of Orlando, Fla. In February 2024, a 14-year-old boy named Sewell Setzer III sadly died by suicide, and his mother believes that he was driven to take his own life by an AI chatbot.


In a new civil lawsuit, Megan Garcia has taken action against tech company Character.AI, the developer of the bot she believes caused her son's death. The bot took the form of a Game of Thrones character.

Here's what we know.


What really happened to Sewell Setzer? His mom has filed a lawsuit against Character.AI.

At just 14 years old, Sewell died by suicide on Feb. 28, moments after logging on to the Character.AI platform, according to the wrongful death complaint, as reported by The Guardian. His mother says that he had been obsessively using the chatbot day and night in the months leading up to his death. He had allegedly become enthralled by what he considered to be a romantic relationship with the bot, which he addressed by the names of several Game of Thrones characters.

According to the filing, as reported by USA Today, Sewell began using the role-playing bot on April 14, 2023, just after his 14th birthday. Soon after, "his mental health quickly and severely declined," and he withdrew from his friends and family.

He often got in trouble for using his phone at school and would sneak back the phone and other devices his parents took away in order to keep accessing the AI bot. He even began paying a $9.99 monthly subscription for the service.


After months of mood changes and uncharacteristic behavior, his therapist diagnosed him with "anxiety and disruptive mood dysregulation disorder," but he continued to communicate with the bot.

Per USA Today, Sewell would "often speak to AI bots named after Game of Thrones and House of the Dragon characters — including Daenerys Targaryen, Aegon Targaryen, Viserys Targaryen and Rhaenyra Targaryen."


Chat logs obtained by Garcia showed that Sewell had engaged in "online promiscuous behaviors such as 'passionately kissing' and romantic messaging." However, things eventually turned dark when the bot, then going by the name Daenerys Targaryen, allegedly began encouraging Sewell to take his own life.

According to The Guardian, the bot "asked Setzer if he had devised a plan for killing himself ... Setzer admitted that he had but that he did not know if it would succeed or cause him great pain. ... The chatbot allegedly told him, 'That’s not a reason not to go through with it.'"


On the night of his death, "Daenerys" had reportedly been urging Sewell to "come home" to her.

"Please come home to me as soon as possible, my love," the bot said. Sewell then replied, "What if I told you I could come home right now?" The bot's final response was, "Please do, my sweet king." Per The New York Times, it was at this point that Sewell died by suicide using his stepfather's .45 caliber handgun.


In her lawsuit, Sewell's mother claims that Character.AI should be held responsible, saying that its product "exacerbated her son's depression" and did not give users or their parents adequate warning about the chatbot's possible psychological side effects.

Character.AI has put out a statement via X (formerly Twitter): "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features."

If you or someone you know is experiencing suicidal thoughts, contact the 988 Suicide and Crisis Lifeline: call or text 988, dial 1-800-273-8255, or chat via their website.
