The mother of a 14-year-old Florida boy says he became obsessed with a chatbot on Character.AI before his death.

On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen, a character from “Game of Thrones.”

“I miss you, baby sister,” he wrote.

“I miss you too, sweet brother,” the chatbot replied.

Sewell, a 14-year-old ninth grader from Orlando, Fla., had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own A.I. characters or chat with characters created by others.

Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)

But he developed an emotional attachment anyway. He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.

  • RunningInRVA@lemmy.world · 29 days ago

    He put down his phone, picked up his stepfather’s .45 caliber handgun and pulled the trigger.

    A tragic story for sure, but there are questions about the teen’s access to the gun he used to kill himself.

    • southsamurai@sh.itjust.works · 28 days ago

      Yeah, that’s not on the app/service.

      Could the kid have found another way? Absolutely. But there’s a fucking reason guns stay locked up and out of reach of minors, even if that means the adults can’t access them quickly. Kids literally can’t exert full self-inhibition of their urges, so you make damn sure that anything that makes a horrible impulse decision easy is out of their hands.

      Shit, my kitchen knives stay in a locked case. Same with dangerous chemicals. There’s a limit to how much you can realistically compartmentalize and keep locked up, but it isn’t hard to get to the point where nobody can reach anything dangerous on impulse. Even a toolbox with a padlock on it is enough to slow someone down and give their brain a chance to inhibit the impulse.

      My policy? If the gun isn’t on my person, it’s locked up in a way that can only be accessed by the people I want to access it. Shit, even my pellet guns stay in the main safe. The two that are available for the other adults are behind fingerprint locks. Even my displayed collection of knives is locked up enough to prevent casual impulses.

      I’m not trying to shit on the parents here, but it isn’t hard to keep a firearm locked up and still rapidly accessible to the owner. Fingerprint safes and locks have been around long enough that the bugs are worked out. They’re not cheap, but if you can afford a firearm in the first place, you can damn well afford to keep it out of everyone else’s hands without a lot of hassle.

    • dirthawker0@lemmy.world · 28 days ago

      Safe? Clearly not. Trigger lock? Cable lock? If one had been there, the story would have mentioned picking it or cutting it. Unloaded? Also clearly not.

      There are so many precautions, any of which take a whole 20 seconds, that the parents could have used to prevent this from happening.

    • j4k3@lemmy.world · 28 days ago

      What kind of monster family has a kid with mental health issues, in therapy, and keeps an accessible gun around unsupervised?

  • Dagamant@lemmy.world · 29 days ago

    I don’t think this is the fault of the AI yet, not unless the chat logs are released and it turns out the bot literally encouraged him to do it. What it sounds like is a kid who needed someone to talk to and didn’t get it from the people around him.

    That said, it would be good if cAI monitored for suicidal ideation. Most of these AI companies are pretty hands-off with their AI and what is said to it.

  • Nuke_the_whales@lemmy.world · 28 days ago

    I’m sorry to say it, but it sounds like the parents ignored this issue and didn’t intervene or get their son help. I don’t see how this is the app’s fault; if anything, it sounds like he was using the app as a form of comfort, and it kept him going a little longer. Sadly, this just sounds like parents lashing out in their grief.

    • Dog@lemmy.world · 28 days ago

      From what I heard, the parents did get the kid a therapist, but it just didn’t work :(