Research: Facebook's algorithm doesn't alter people's beliefs


Facebook's algorithm, which uses machine learning to decide which posts rise to the top of users' feeds based on their interests, has been accused of giving rise to "filter bubbles" and enabling the spread of misinformation. — Photo: OLIVIER DOULIERY/AFP

Do social media echo chambers deepen political polarization, or simply reflect existing social divisions?

A landmark research project that investigated Facebook around the 2020 US presidential election published its first results Thursday, finding that, contrary to assumption, the platform's often criticized content-ranking algorithm doesn't shape users' beliefs.

