Facebook wants users’ responses to improve feeds

SAN FRANCISCO: Facebook said on April 22 it would emphasise user feedback when prioritising posts on the leading social network, the latest move to quell concerns over its algorithms.

The California giant said it would add weight to surveys asking users if certain messages are “worth your time” as part of its ranking process for its main news feeds.

“Our algorithm uses thousands of signals to rank posts for your News Feed with this goal in mind,” said a blog post from Aastha Gupta, Facebook product management director.

“This spring, we’re expanding on our work to use direct feedback from people who use Facebook to understand the content people find most valuable. And we’ll continue to incorporate this feedback into our News Feed ranking process.”

The move comes with Facebook and other online platforms under pressure over opaque algorithms which determine what users see at the top of their feeds.

Critics say these systems may be geared to highlight sensational or divisive content, aiming to keep users engaged to boost monetisation.

Facebook in recent months has been de-emphasising political content, which tends to be more polarising, and moving to give users more control over their feeds.

In March, Facebook unveiled a change to give users more control over their News Feed and even to turn off the Facebook algorithm entirely and see posts in chronological order.

The latest tweak aims to use the surveys asking “Is this post worth your time?” to prioritise content in Facebook’s ranking algorithm.

“While a post’s engagement – or how often people like it, comment on it, or share it – can be a helpful indicator... this survey-driven approach, which largely occurs outside the immediate reaction to a post, gives a more complete picture of the types of posts people find most valuable,” said Gupta.

“Now we’re building on these surveys by asking new questions about the content people find valuable as well as the content people don’t enjoy seeing in their News Feed.”

Facebook’s move away from divisive political content gained momentum after the Jan 6 US Capitol riot which was organised in part on social media.

After an internal review, Facebook acknowledged that it failed to do enough to prevent the circulation of the #StopTheSteal movement that led to the violence.

“We took a number of steps to limit content that sought to delegitimise the election,” a Facebook spokesperson said after a BuzzFeed report on the review.

“As we’ve said previously, we still saw problematic content on our platform during this period and we know that we didn’t catch everything.” – AFP
