TikTok sued by content moderator disturbed by graphic videos


TikTok is being sued by a content moderator who claims the frenetic pace of screening horrific videos has left moderators traumatised. — AP Photo

TikTok’s 10,000 content moderators are exposed to a regular diet of child pornography, rapes, beheadings and animal mutilation, according to a lawsuit filed against the video-sharing platform and its parent, ByteDance Inc.

It gets worse. Content moderator Candie Frazier says in her proposed class-action lawsuit that she has screened videos involving freakish cannibalism, crushed heads, school shootings, suicides, and even a fatal fall from a building, complete with audio.

And there’s no escaping it, Frazier claims. TikTok requires moderators to work at a frantic pace, watching hundreds of videos per 12-hour shift with only an hour off for lunch and two 15-minute breaks, according to Thursday’s complaint in federal court in Los Angeles.

"Due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video, and simultaneously view three to ten videos at the same time,” her lawyers said in the complaint.

TikTok said it doesn’t comment on ongoing litigation, but strives “to promote a caring working environment for our employees and contractors.”

"Our safety team partners with third party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally,” a company spokesperson said in a statement.

TikTok was a member of a group of social media companies including Facebook and YouTube that developed guidelines for helping moderators cope with the images of child abuse that their jobs required them to view, according to the complaint.

But TikTok failed to implement the guidelines, which include providing psychological support and limiting shifts to four hours, according to the suit.

Frazier, who lives in Las Vegas, said she suffers from post-traumatic stress disorder as a result of the disturbing videos she has had to watch.

"Plaintiff has trouble sleeping and when she does sleep, she has horrific nightmares,” according to the complaint.

Frazier, who seeks to represent other TikTok content screeners, is asking for compensation for psychological injuries and a court order requiring the company to set up a medical fund for moderators. – Bloomberg
