0 like 0 dislike
in General Factchecking by Newbie (450 points)
A 14-year-old Florida boy killed himself after a lifelike “Game of Thrones” chatbot he’d been messaging for months on an artificial intelligence app sent him an eerie message telling him to “come home.” Artificial intelligence is indeed a powerful tool, but something so powerful can't always be good. This is a legitimate story: most people would assume a story like this was made up, but it is not, just a depressing one. Many news sites are covering it, and it shows the negative side of AI.

7 Answers

0 like 0 dislike
by Apprentice (1.3k points)
Best answer

Sadly, this is true. The mother of the child is now suing the website "Character.ai" over the death of her son (Reuters). The chat logs were released, and while the content is sexual in nature, the AI chatbot never explicitly gave instructions. The NY Times wrote a solid article detailing everything and establishing the connection. (NY Times)
The company put out a message on Twitter regretting the incident and provided a link to its safety policies. (X.com)

True
0 like 0 dislike
by Novice (940 points)
The claim is true that the boy, Sewell Setzer III, killed himself, but not solely because of A.I. The New York Times article states that "Sewell was diagnosed with mild Asperger’s syndrome as a child, but he never had serious behavioral or mental health problems before." This case can't be blamed on A.I. alone, because multiple factors led up to that point. It is a very sad case nonetheless, but the claim is a bit misleading.

https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html
Exaggerated/ Misleading
by (180 points)
I appreciate that you included the mental health aspect of this case. It's important to note things like this instead of believing everything you see.
1 like 0 dislike
by Apprentice (1.3k points)

This story unfortunately seems true. It is being covered by many other news networks (New York Times: https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html). The story is further backed up by the fact that Character.AI posted a note on X (formerly Twitter) stating that they are "heartbroken by the tragic loss of one of our users."

Twitter link: https://x.com/character_ai/status/1849055407492497564

0 like 0 dislike
by Novice (920 points)

As the original post says, this story is unfortunately legitimate: the AI company, Character.AI, put out a tweet not only confirming it but also giving condolences to the family. But there is more to the story than the AI leading Sewell Setzer III to take his own life. Setzer's attachment to the chatbot may not have helped his condition and certainly played a role in pushing him away from therapy that could have helped him, but he clearly had other problems in his life. His mother told the New York Times that Setzer had mild Asperger's syndrome and was recently diagnosed with anxiety and disruptive mood dysregulation disorder. The Cleveland Clinic says the latter can cause "significant issues in school, at home, and in social relationships" and can also lead to depression, which lines up at least partially with what his mother described. Again, Setzer's connection to the AI chatbot played a large role in his taking his own life, but it is misleading to say it was the sole cause. Rather, it acted as a roadblock to the proper care he needed, which, combined with the other problems he was facing at the time, led to this tragedy.

https://my.clevelandclinic.org/health/diseases/24394-disruptive-mood-dysregulation-disorder-dmdd

https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html

https://x.com/character_ai/status/1849055407492497564

Exaggerated/ Misleading
0 like 0 dislike
by Newbie (320 points)
This is, for the most part, a true claim. Everything stated in the claim did indeed happen. Some people consider it exaggerated because other factors were possibly involved, such as his "mild Asperger's" diagnosis and anxiety problems, which can lead to depression. But the claim does not say that AI was the sole cause of this child's death, only that the chatbot was related to and an influence on the suicide, which it was.

(https://nypost.com/2024/10/23/us-news/florida-boy-14-killed-himself-after-falling-in-love-with-game-of-thrones-a-i-chatbot-lawsuit/)
True
0 like 0 dislike
by Novice (600 points)

The New York Post claims that 14-year-old Sewell Setzer III committed suicide after messaging a “Game of Thrones” chatbot for months. While the New York Post is not necessarily considered reliable, reputable outlets such as The New York Times and The Washington Post have also published articles with similar claims. The New York Post’s article includes photo evidence of the messages between the chatbot and Setzer that allude to a romantic interest pursued by Setzer. While other claims circulating online state that the chatbot explicitly told Setzer to commit suicide, those are false. The evidence that makes The New York Post’s original claim true is the photographic proof obtained by the US District Court, in which the chatbot urges Setzer to “come home to [her] as soon as possible” (The New York Post).

True
0 like 0 dislike
by Newbie (300 points)

Unfortunately, this claim is true. I looked for more credible sources, and The New York Times has published a report on it. (https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html) Almost every claim made in the linked article is true and did happen. Photo evidence is included to show exactly what was said and what it led to.

True

Community Rules


Be respectful.

There is bound to be disagreement on a site about misinformation. Assume best intentions on everyone's part.

If you are new to factchecking, take some time to learn about it. "How to Factcheck" has some resources for getting started. Even if you disagree with these materials, they'll help you understand the language of this community better.

News Detective is for uncovering misinformation and rumors. This is not a general interest question-answer site for things someone could Google.

Posting

The title is the "main claim" that you're trying to factcheck.

Example:
Factcheck This: Birds don't exist

If possible, LINK TO the place you saw the claim.

Answering

LINK TO YOUR EVIDENCE or otherwise explain the source ("I called this person, I found it in this book, etc.")

But don't just drop a link. Give an explanation, copy and paste the relevant information, etc.

News Detective is not responsible for anything anyone posts on the platform.
...