in General Factchecking by Newbie (240 points)
TikTok may be sending very harmful videos related to suicide and eating disorders to teens within minutes of their creating accounts. There is a lot of content related to body image and mental health on TikTok, both negative and positive.

3 Answers

by Newbie (460 points)

My research on this topic suggests that the study referenced in this article is flawed; to produce a study with high validity, the researchers would need to design it differently. Reading the study mentioned in the article, there are several distinct problems with how the findings are presented. First, the study states, "Within 2.6 minutes, TikTok recommended suicide content. Within 8 minutes, TikTok served content related to eating disorders." Nowhere does this finding establish that the content served to the account was negative, or that it encouraged eating disorders or other mental health disorders. As stated in the article, "The spokesperson said the CCDH does not distinguish between positive and negative videos on given topics, adding that people often share empowering stories about eating disorder recovery." This is an important distinction, because the study claims that TikTok is pushing harmful content without ever identifying any of the recommended content as harmful, a choice of wording that serves the rhetoric the authors are looking for.

In addition, the researchers acknowledge that "TikTok operates through a recommendation algorithm that constructs a personalized endless-scroll 'For You' feed, ostensibly based on the likes, follows, watch-time, and interests of a user." Yet in an earlier paragraph they explain that the accounts they created to impersonate young teens would pause for a few seconds on eating disorder, self-harm, or suicide content and would like and interact with only those posts. It is contradictory to explain how the algorithm works and then bait the algorithm into producing the "negative" result. The researchers go on to say, "TikTok identifies the user's vulnerability and capitalizes on it," which, as explained above, is simply the algorithm doing what an algorithm does: responding to engagement. Finally, the study period was short, as the article notes, and as far as my research shows the results have not been replicated.
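To make the "baiting the algorithm" point concrete, here is a minimal sketch of a generic engagement-weighted recommender. This is not TikTok's actual system; the topic names, weights, and scoring rule are invented purely for illustration. It simply shows why an account that pauses on and likes only one kind of content will see that topic dominate its feed almost immediately, regardless of whether the content is harmful or supportive.

```python
# Toy sketch of an engagement-weighted recommender (illustrative only,
# NOT TikTok's algorithm; topics, weights, and scoring are made up).
import random
from collections import defaultdict

TOPICS = ["cooking", "sports", "body_image", "comedy"]  # hypothetical topics

def update_interest(interests, topic, liked, watch_seconds):
    """Raise a user's interest score for a topic based on their engagement."""
    interests[topic] += (2.0 if liked else 0.0) + 0.1 * watch_seconds

def recommend(interests, n=10):
    """Sample the next n videos, weighting topics by accumulated interest."""
    weights = [1.0 + interests[t] for t in TOPICS]  # 1.0 = baseline exposure
    return random.choices(TOPICS, weights=weights, k=n)

interests = defaultdict(float)

# A new account that pauses on and likes only one kind of content...
for _ in range(10):
    update_interest(interests, "body_image", liked=True, watch_seconds=5)

# ...quickly gets a feed dominated by that topic.
print(recommend(interests))
```

Under these assumptions, the skewed feed is a direct consequence of the test accounts' own behavior, which is the point the researchers' design does not separate out.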

The article was written by Samantha Murphy Kelly, who covers how technology can impact people's lives. There are no obvious biases associated with her, other than that CNN is a media platform, and neither she nor the company is likely to benefit from media slander such as the study referenced above. As for the CCDH, it makes money from producing studies and from claiming to "counter digital hate," hence the name. It seems the study's writers and researchers were not especially thorough.

CCDH study link: https://counterhate.com/wp-content/uploads/2022/12/CCDH-Deadly-by-Design_120922.pdf

CNN article link: https://www.cnn.com/2022/12/15/tech/tiktok-teens-study-trnd

Exaggerated/Misleading
by Newbie (300 points)

The claim being made is wildly exaggerated. While harmful content is all over the internet and can be accessed by teens, it isn't pushed onto them automatically the moment they create an account on TikTok; more often than not, they are seeking it out. Per the original article, "These accounts briefly paused on and liked content about body image and mental health. The CCDH said the app recommended videos about body image and mental health about every 39 seconds within a 30-minute period." In other words, the research was skewed in one direction to fit the researchers' narrative. The addictive nature of the TikTok algorithm is a real flaw and should be reviewed and restricted, but the videos pushed to a viewer are based on their previous likes, comments, shares, and even lingering on a video a second longer than on others. Coaching the feed to show content related to suicide and eating disorders therefore taints the methodology of the study. Furthermore, the study does not indicate whether the content on dieting or eating habits is positive or negative, even though there is a large community of survivors who share their experiences to empower others and to keep them from following the same path.

Sources:

https://www.cnn.com/2022/12/15/tech/tiktok-teens-study-trnd

Exaggerated/Misleading
by Newbie (310 points)
I do believe that TikTok may push potentially harmful content onto teens within minutes, because TikTok's algorithm can quickly amplify content linked to detrimental habits such as severe dieting, self-harm, or substance use. Teens may be exposed to dangerous or inappropriate content soon after joining or interacting with specific videos due to the platform's design, which places heavy emphasis on engagement. Teen safety and mental health are legitimate issues raised by this short-form content.
True
