'Bosses see us as machines': Content moderators for Big Tech unite to tackle mental trauma


Outsourced and out of sight, content moderators call on Big Tech to address the mental trauma of content moderation. — Reuters

BRUSSELS: Content moderators from the Philippines to Turkey are uniting to push for greater mental health support to help them cope with the psychological effects of exposure to a rising tide of disturbing images online.

The people tasked with removing harmful content for tech giants such as Meta Platforms and TikTok report a range of noxious health effects, from loss of appetite to anxiety and suicidal thoughts.


