Facebook's algorithm, which uses machine learning to decide which posts rise to the top of users' feeds based on their interests, has been accused of creating "filter bubbles" and enabling the spread of misinformation. — Photo: OLIVIER DOULIERY/AFP
Do social media echo chambers deepen political polarization, or simply reflect existing social divisions?
A landmark research project investigating Facebook around the 2020 US presidential election published its first results Thursday, finding that, contrary to common assumption, the platform's much-criticized content-ranking algorithm does not shape users' beliefs.
