TikTok: THE NIGHTMARE of Moderators Analyzing Rapes, Crimes, or Acts of Genocide


TikTok is one of the largest social media platforms in the world, with over 1 billion users who post hundreds of millions of short videos annually, and its team of moderators has to sift through thousands of videos depicting rape, abuse of minors, acts of cannibalism, and more.

A recent lawsuit filed against Bytedance, the company that owns TikTok, accuses the company of forcing its moderators to live a daily nightmare because of the toxic videos they must review before those videos appear on the social media platform.

TikTok sanctions moderators who fail to carefully analyze the videos uploaded by the platform's users and to catch material containing sexual acts with minors, acts of violence, acts of cannibalism, or various other crimes before it is published on the platform.

The extremely toxic work environment in which TikTok moderators are forced to work has led to conditions such as post-traumatic stress, usually found in soldiers who carry out missions in war zones; in the moderators' case, it is caused by the constant viewing of violent video material.

Viewing thousands of videos of rapes, violence against people and animals, and various other extremely violent acts for up to 12 hours a day drove the moderators to take Bytedance to court and demand much better working conditions.

Millions of such videos are uploaded to the TikTok platform daily, some depicting the genocide in Myanmar, and some moderators have reported extreme fatigue, trouble sleeping at night, extreme weight fluctuations, anxiety, vomiting, and other physical health problems.

In response to the lawsuit, the company stated: “While we do not comment on pending litigation, we strive to promote a caring work environment for our employees and contractors. Our safety team collaborates with third-party firms in the critical work of helping to protect the TikTok platform and community, and we continue to expand a range of wellness services so moderators feel supported mentally and emotionally.”

TikTok also sanctions moderators who do not keep to their work schedule or who fail to detect extremely violent content before it is published on the platform, cutting a percentage of their monthly salary if they miss their daily quotas for verifying uploaded videos.

The company defends itself by saying it tries to look after the physical and mental health of its employees, even as it relies on their work to keep acts of violence that no one should see away from the eyes of the world; but as with Facebook, it is the moderators who suffer most from this process.