It is often jokingly said that the algorithm knows people better than they do themselves. According to a study, this can harm younger people in particular.
While the video app TikTok is also enjoying increasing popularity in Germany, critics repeatedly point out deficiencies in child and youth protection on the platform. According to a new study by the Center for Countering Digital Hate (CCDH), the app actively promotes content that glorifies self-harm, eating disorders, and even suicide. “These results are every parent’s nightmare,” Imran Ahmed, executive director of the CCDH, said of the study published on Thursday.
According to Ahmed, the TikTok algorithm, which is difficult to understand from the outside, could cause considerable damage to teenagers in particular. “Instead of entertainment and security, our findings reveal a toxic environment, especially for younger TikTok users.” The Chinese parent company Bytedance has been criticized for similar findings in the past.
Diets, liposuction, self-harm
As part of the study, researchers created fictional TikTok profiles of teenagers in the United States, Canada, the United Kingdom, and Australia. All of the accounts shared a stated age of 13 – the minimum age set by Bytedance – and all of the supposed teenagers were soon served videos with sometimes disturbing content.
“Young people’s feeds are bombarded with harmful, upsetting content, the aggregate of which can have a significant impact on their worldview and their physical and mental health,” the report said. The so-called For You pages of the test profiles were full of videos promoting radical diets and surgical fat reduction, but also self-harm and even suicide.
The core of the problem, the researchers conclude, is above all the platform’s opaque algorithm. Its main task is to anticipate users’ viewing preferences in order to present them with an endless stream of perfectly tailored content. What exactly appears in users’ feeds is therefore largely out of their own hands.
To be recommended videos with harmful content, it is enough to linger for a few seconds on a video of a similar type – and not only the newly published CCDH study reached this conclusion. Investigations by various US media outlets have shown that, through the same mechanism, the algorithm can also contribute to the radicalization of young people and the spread of right-wing extremist ideas.
“TikTok must make its algorithms transparent”
In view of the accumulating allegations, Bytedance points to existing guidelines under which harmful and violent content is prohibited on the platform. In addition, experts are regularly consulted to ensure the safety of users, a company spokeswoman told the Guardian.
“We recognize that triggering content looks different for everyone and we remain focused on creating a safe and comfortable environment for all,” the company’s statement said.
That’s not enough for the authors of the study. “TikTok must make its algorithms fully transparent,” the researchers demanded. Otherwise, it is the task of the responsible regulators to intervene and force the platform to consistently enforce its own guidelines. In the US state of Indiana, Bytedance already faces a lawsuit – in part for political reasons – over alleged deficiencies in youth and data protection.