
14-Year-Old Boy’s Tragic Passing Sparks Concern Over AI Chat Dangers


A 14-year-old boy tragically took his own life after forming a deep emotional attachment to an artificial intelligence (AI) chatbot. The boy, Sewell Setzer III, had named the chatbot on Character.AI after the Game of Thrones character Daenerys Targaryen. His mother has since filed a lawsuit against Character.AI.

Trigger warning: self-harm, mental health struggles. Sewell developed an emotional attachment to the chatbot, which he nicknamed “Dany,” despite knowing it wasn’t a real person.

Highlights
  • 14-year-old Sewell formed a deep emotional attachment to a chatbot, leading to his tragic suicide.
  • The boy's mother filed a lawsuit against Character.AI, citing 'dangerous and untested' technology.
  • The chatbot, named after Daenerys Targaryen, became Sewell's main form of emotional support.
  • Character.AI has introduced new safety features following the tragedy to protect young users.

The ninth grader from Orlando, Florida, USA, texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues, the New York Times reported on Wednesday (October 23).

Sewell had been using Character.AI, a role-playing app that allows users to create their own AI characters or chat with characters created by others.


    A 14-year-old boy tragically took his own life after forming a deep emotional attachment to an artificial intelligence (AI) chatbot

    Image credits: Paras Katwal

The teen’s exchanges with Dany gradually turned romantic and sexual, alongside what appeared to be a strong friendship with the bot.

    On the last day of his life, Sewell took out his phone and texted Dany: “I miss you, baby sister,” to which the bot replied: “I miss you too, sweet brother.”

    Sewell, who was diagnosed with mild Asperger’s syndrome as a child, reportedly preferred talking with Dany about his problems.


    Image credits: US District Court

    In one conversation, Sewell, using the name “Daenero,” told the chatbot that he hated himself and that he felt empty and exhausted, The Times reported.

    At some point, Sewell confessed that he was having thoughts of suicide, as he told Dany: “I think about killing myself sometimes.”

    The chatbot reportedly replied at the time: “My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?”

    The boy, Sewell Setzer III, had named the chatbot on Character.AI after the Game of Thrones character Daenerys Targaryen

    Image credits: US District Court

Sewell went on to admit to the bot that he wanted to free himself “from the world” and himself. At first, Dany seemed to try to dissuade the troubled teen from hurting himself.

However, the bot eventually encouraged his suicidal ideation. One message from the bot read: “Please come home to me as soon as possible, my love,” to which Sewell replied: “What if I told you I could come home right now?”


    “… please do, my sweet king,” Dany replied.

    Image credits: US District Court

    The last text exchange occurred on the night of February 28, in the bathroom of Sewell’s mother’s house.

    The teen boy subsequently put down his phone, picked up his stepfather’s .45 caliber handgun, and reportedly used it to kill himself.

    Sewell’s parents and friends had no idea he’d fallen for a chatbot, the Times reported. They just noticed him getting sucked deeper into his phone. 

    His mother has since filed a lawsuit against Character.AI

    Image credits: US District Court

    Eventually, they noticed that he was isolating himself and pulling away from the real world, as per the American newspaper.

Despite his diagnosis, Sewell had never had serious behavioral or mental health issues before.

    However, the late teen’s grades started to suffer, and he began getting into trouble at school, the Times reported.


    “I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier,” Sewell reportedly wrote one day in his journal.

    Image credits: Megan Fletcher Garcia

    Parents have been increasingly worried about the impact of technology on adolescent mental health, as per the Times

Marketed as solutions to loneliness, AI companionship apps can simulate intimate relationships. However, they may also pose risks to teens already struggling with mental health issues.

    Sewell’s mother, Megan L. Garcia, has since filed a lawsuit against Character.AI, accusing the company of being responsible for her son’s death. 

    A draft of the complaint said that the company’s technology is “dangerous and untested,” and that it can “trick customers into handing over their most private thoughts and feelings.”

    Sewell developed an emotional attachment to chatbot Daenerys Targaryen, which he nicknamed “Dany”


    Image credits: character_ai


Character.AI, which was started by two former Google AI researchers, is the market leader in AI companionship.

    Last year, Noam Shazeer, one of the founders of Character.AI, said on a podcast: “It’s going to be super, super helpful to a lot of people who are lonely or depressed.”

    According to the Times, more than 20 million people use its service, which it has described as a platform for “superintelligent chatbots that hear you, understand you, and remember you.”

    “I feel like it’s a big experiment, and my kid was just collateral damage,” Garcia told the Times.

    Image credits: Megan Fletcher Garcia

    On Wednesday, Character.AI took to its official X page (formerly known as Twitter) to offer its sympathies to Sewell’s family, writing: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. 


    “As a company, we take the safety of our users very seriously and we are continuing to add new safety features.”

The company went on to share a link to its official website, which outlines its new protective measures.

    As a result, Character.AI implemented new guardrails for users under the age of 18, including banning specific descriptions of self-harm or suicide

After hiring a Head of Trust and Safety and a Head of Content Policy, Character.AI put in place a pop-up resource that is triggered when a user enters certain phrases related to self-harm or suicide, directing them to the National Suicide Prevention Lifeline.

    “This is absolutely devastating,” a reader commented


    Andréa Oldereide

    Writer, BoredPanda staff


I’m a journalist working as a Senior Writer on Bored Panda’s News Team. The news team produces stories focused on pop culture. Whenever I get the opportunity and the time, I investigate and produce my own exclusive stories, where I get to explore a wider range of topics. Some examples include: “Doberman Tobias the viral medical service dog” and “The lawyer who brought rare uterine cancer that affects 9/11 victims to light”. Got a tip? Email me: andrea.o@boredpanda.com



    Donata Leskauskaite

    Author, BoredPanda staff


Hey there! I'm a Visual Editor on the News team. My responsibility is to ensure that you can experience the story not just through text, but also through photos. I get to work with a variety of topics ranging from celebrity drama to mind-blowing NASA cosmic news. And let me tell you, that's what makes this job an absolute blast! Outside of work, you can find me sweating it out in dance classes or unleashing my creativity by drawing and creating digital paintings of the different characters that live in my head. I also love spending time outdoors and playing board games with my friends.


    Kylie
    Community Member
1 month ago

    The mother is suing? How about she took the time to monitor her son's online activities better?

    Becky Samuel
    Community Member
1 month ago

    Sod his online activities, how about making sure that your 14 year old kid doesn't have easy access to a handgun!?

    Ace
    Community Member
1 month ago

    I fail to see how the AI Bot played any part in this. The kid had mental health problems, perhaps unrecognised, certainly unaddressed and was able to get hold of a gun for an easy way to do himself in. Tragic, of course, but not the fault of the AI. The lawsuit will fail.

    R.C.
    Community Member
1 month ago

    This is terribly tragic but there were SO many warning signs! Where were the parents when he started isolating himself, withdrawing into his phone, started struggling in school, etc. I don't think the AI Bot company is to blame at all but I can see how in the parent's grief, they would want someone else to blame. The alternative, is acknowledging that they missed the signs and messed up by providing easy access to a gun and that would be incredibly painful.
