TikTok content moderators sue over exposure to disturbing content

PHOTO: TikTok content moderators sue for trauma. (via TikTok)

Maybe it’s an “Okay Boomer” thing, but many of our readers may find endlessly scrolling through the short and often obnoxious videos that make up the TikTok app somewhat nauseating. But if you think what’s published there is bad, the millions of videos that don’t make the cut are far, far worse. That content is blocked by a team of about 10,000 content moderators, who are now suing TikTok over the frantic pace at which they’re exposed to disturbing and often criminal videos.

In a lawsuit filed against TikTok parent company ByteDance, employees who work in content moderation describe 12-hour shifts in which they watch hundreds of videos non-stop, with only two 15-minute breaks and a one-hour lunch break. The list of horrors they describe encountering in their moderation work goes beyond even what you might think someone would attempt to upload for the public to view.

Employees report frequent run-ins with fights and violence, rape, child pornography, animal mutilation, executions, beheadings, suicides, school shootings, cannibalism, and brutal deaths such as crushed heads and falls from buildings. One overwhelmed content moderator has proposed a class-action lawsuit over the trauma she’s been exposed to while working for TikTok.

As the saying goes, it’s a dirty job but someone’s got to do it. But the feverish pace and turnover expected of these moderators mean they have a maximum of 25 seconds per video before jumping to the next, and they’re often monitoring multiple videos at a time, with displays showing 3 or as many as 10 videos at once. (One imagines the disturbing scenes of Alex’s “aversion therapy” in A Clockwork Orange.)


TikTok joined a group of social media companies, including Facebook and YouTube, that recognize the core problem: to keep users from seeing this disturbing content, someone is often forced to see it and catch it first. The group has developed guidelines to help employees cope with the images, such as child abuse, that their role as content moderators constantly exposes them to.

But the lawsuit alleges that TikTok did not enact these guidelines, which call for a 4-hour limit on content moderation shifts and psychological support for those who feel traumatized. The woman who brought the suit says she suffers from post-traumatic stress disorder from reviewing so much disturbing content, all from her Las Vegas home. The complaint says she has terrifying nightmares related to the content she’s been exposed to, when she can sleep at all.

TikTok has not issued a response to the allegations or the pending lawsuit. The woman who filed suit intends to expand it to represent more content moderators at the company. The suit will ask the court to order TikTok to set up a medical fund for affected moderators and to provide compensation for psychological injuries.

SOURCE: Bangkok Post


Neill Fronde

Neill is a journalist from the United States with 10+ years of broadcasting experience and work in national news and magazine publications. He graduated with a degree in journalism and communications from the University of California and has been living in Thailand since 2014.
