UK tells tech firms to 'tame algorithms' to protect children


FILE PHOTO: Printed Facebook and TikTok logos are seen in this illustration taken February 15, 2022. REUTERS/Dado Ruvic/Illustration/File Photo

LONDON (Reuters) - Social media platforms like Facebook, Instagram and TikTok will have to "tame" their algorithms to filter out or downgrade harmful material to help protect children under proposed British measures published on Wednesday.

The plan by regulator Ofcom is one of more than 40 practical steps tech companies will need to implement under Britain's Online Safety Act, which became law in October.

