Outsourced and out of sight, content moderators call on Big Tech to address the mental trauma of content moderation. — Reuters
BRUSSELS: Content moderators from the Philippines to Turkey are uniting to push for greater mental health support to help them cope with the psychological effects of exposure to a rising tide of disturbing images online.
The people tasked with removing harmful content from the platforms of tech giants such as Meta Platforms and TikTok report a range of damaging health effects, from loss of appetite to anxiety and suicidal thoughts.
