'Bosses see us as machines': Content moderators for Big Tech unite to tackle mental trauma


Outsourced and out of sight, content moderators call on Big Tech to address the mental trauma of content moderation. — Reuters

BRUSSELS: Content moderators from the Philippines to Turkey are uniting to push for greater mental health support to help them cope with the psychological effects of exposure to a rising tide of disturbing images online.

The people tasked with removing harmful content for tech giants like Meta Platforms or TikTok report a range of noxious health effects, from loss of appetite to anxiety and suicidal thoughts.
