Facebook failed to remove reported extremist posts: The Times


TECH | Thursday, 13 Apr 2017

Cross checking: Facebook said it would work with several leading French news organisations to ensure that false news items were not published on its platform. — Reuters

LONDON: Facebook failed to remove dozens of instances of extremist content and child pornography even after the social network's moderators were directly informed of the potentially illegal material, an investigation by The Times showed on April 13.

Using a fake profile set up last month, a Times journalist found images and videos glorifying Islamic State and recent deadly attacks in London and Egypt, along with graphic images of child abuse, and asked site moderators to remove them.

Facebook moderators removed some of the reported images but left untouched pro-jihadist posts praising recent attacks and calling for new ones. The company appeared to take action only after The Times identified itself as reporting a story on the matter.

Failure to remove content that is illegal under British law, after company officials have been notified of its existence, could expose Facebook to criminal prosecution for its role in encouraging the publication and distribution of such imagery.

The social media giant faces new laws in countries around the world designed to force it to act faster against illegal content, but it has struggled to keep pace, as illicit posts can reappear as quickly as they are identified and taken down.

A Facebook spokesman said the company had now removed all the images identified by The Times as potentially illegal, acknowledging that they "violate our policies and have no place on Facebook".

"We are sorry that this occurred," Facebook Vice President of Operations Justin Osofsky said in a statement. "It is clear that we can do better, and we'll continue to work hard to live up to the high standards people rightly expect of Facebook.”

A spokesman for London's Metropolitan Police urged individuals to report extremist content to the force via an online form, and declined to comment on whether it was investigating whether Facebook had failed to act when notified of the illegal content.

"Where material breaches UK terrorism laws, the Counter Terrorism Internet Referral Unit (CTIRU) will, where possible, seek the removal of the content by working with the relevant Internet hosting company," the spokesman said. — Reuters

