LONDON (Reuters) - Social media platforms like Facebook, Instagram and TikTok will have to "tame" their algorithms to filter out or downgrade harmful material to help protect children under proposed British measures published on Wednesday.
The plan by regulator Ofcom is one of more than 40 practical steps tech companies will need to implement under Britain's Online Safety Act, which became law in October.