PHILADELPHIA: Facebook is an important tool for Akin Olla, a West Philadelphia socialist activist. He's used the platform to organise protests outside statehouses, promote his podcast on revolutions, and share his columns criticising President Joe Biden. He’s the social media manager for a left-leaning nonprofit that trains activist groups.
But Olla, 30, found that vast platform used against him a few weeks ago. As Facebook grappled with the fallout from the deadly Jan 6 riot at the US Capitol, the company restricted Olla’s ability to communicate with others. It was part of Facebook’s widespread security measures to prevent violence on Inauguration Day.
It remains unclear why Facebook’s action swept up Olla. He regained full access to Facebook after 10 days, but the experience left him uneasy. He was hesitant to write his next column for the British daily the Guardian about racism at the FBI, believing his activism had drawn Big Brother-like attention.
“It just feels weird to know that Facebook has me on some sort of list,” Olla said.
While bitter complaints over “Big Tech” and its “censorship” have become a staple of conservative media, Olla’s example shows that left-wing social-media users have been targeted, too.
Companies such as Facebook are facing pressure to moderate content that’s extremist, inaccurate, and inappropriate. At the same time, others worry such restrictions will limit free speech on the privately owned platforms that have effectively become online public squares. Caught in the middle are users who believe their accounts were mistakenly flagged as dangerous, and who are unnerved at being targeted.
“It’s both scary how much power Facebook has and what gravity its decisions have,” said David Greene, civil liberties director for the Electronic Frontier Foundation (EFF), a digital rights group that has criticised Facebook. “But it’s also a tremendous tool at the same time. So I think the policy challenge is, how do you preserve the tool without vesting so much power in someone you don’t necessarily trust?”
The issue reached a flashpoint last month when Twitter banned Donald Trump, then president, after the insurrection. Social media companies have recently purged their platforms of right-wing conspiracy theorists and some of Trump’s allies, including the CEO of MyPillow. Facebook’s moderation policies came under scrutiny again last week when it said it mistakenly removed some left-wing accounts, including the Socialist Workers Party.
Less attention was given to Facebook’s actions against an untold number of less prominent users. To prevent violence on Inauguration Day, the company said it blocked the creation of events near the White House, US Capitol, and statehouses, according to a Facebook blog post. The company barred anyone outside the US from organising any events in the country.
Facebook temporarily restricted some Americans, too, based on “signals”, such as repeat violations of the company’s policies, the blog post said. Those users couldn’t create live videos or make Facebook groups, events, or pages. Olla said he couldn’t communicate in existing pages or groups, either. He still had access to other features, such as sharing photos or messaging individual users.
Olla had no previous policy violations, according to a message he received from Facebook. Yet the company wouldn’t consider his appeal, he said. A Facebook spokesperson initially told The Inquirer that restricting Olla’s account was a mistake, then said the action was correctly taken, but didn’t explain why.
Facebook spokesperson Kristen Morea said the security measures reflected “signals” and that Facebook’s actions were taken “at scale”. But she declined to say how many users were restricted or what other “signals” Facebook used to identify potentially risky accounts. She initially said some people may have been mistakenly restricted, before saying the action on Olla’s account was correct. She declined to discuss Olla’s case in detail.
A computer algorithm tasked with finding troubling accounts could use a person’s Facebook friends or the pages they follow as data points, said Jason Thatcher, a Temple University professor of management information systems. An algorithm could also look for words or phrases to predict violence, but such systems are still far from perfect. A phrase like “battleground state” would likely be deemed innocuous, for example. “Make this state a battleground,” however, could be concerning, Thatcher said.
“They try to build the algorithms to be able to detect those nuances, and usually that requires a person to help it,” he said. “But again, Facebook was under lots of pressure to act fast.”
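The kind of phrase-based screening Thatcher describes can be pictured with a minimal sketch. The phrase lists, weights, and threshold below are invented for illustration; Facebook’s actual signals are not public, and real systems would use far more sophisticated machine-learning models rather than literal substring matching.

```python
# Hypothetical phrase-based risk scoring, illustrating why "battleground
# state" passes while "make this state a battleground" could get flagged.
# All phrases and weights here are made up for illustration.

RISKY_PHRASES = {
    "make this state a battleground": 0.9,
    "storm the capitol": 1.0,
}
BENIGN_PHRASES = {
    "battleground state",  # routine election-coverage term
}

def risk_score(post: str) -> float:
    """Return a crude 0..1 risk score for a single post."""
    text = post.lower()
    score = 0.0
    for phrase, weight in RISKY_PHRASES.items():
        if phrase in text:
            score = max(score, weight)
    # A benign phrase on its own should not raise the score.
    if score == 0.0 and any(p in text for p in BENIGN_PHRASES):
        return 0.0
    return score

def should_flag(posts: list[str], threshold: float = 0.8) -> bool:
    """Flag an account if any recent post scores above the threshold."""
    return any(risk_score(p) >= threshold for p in posts)
```

A literal matcher like this illustrates the nuance problem Thatcher raises: without human review or richer context, word-order variations and sarcasm can flip an innocuous post into a “risky” one, or vice versa.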
Olla thinks Facebook may have flagged him for his left-wing views calling for radical change in the US government. Olla’s page is filled with references to “revolution”, criticisms of Biden, and at least one post suggesting that riots – when paired with peaceful protests – can help bring about social change. That post followed the looting that occurred last summer in the wake of the police killing of George Floyd.
“It’s pretty absurd,” Olla said, noting that he trains activist leaders in nonviolent protest strategies and didn’t want his employer to be publicly identified so it could be attacked online. “The idea of me interfering with the inauguration in that fashion just doesn’t really make sense.”
Olla’s comments certainly didn’t attempt to incite imminent lawless action – speech that isn’t protected by the First Amendment, according to legal experts. But the Constitution’s protections apply to government actions, not private ones. The widely used social media platforms are acting under a less speech-protective standard than the constitutional one, said Amanda Shanor, a Wharton assistant professor of legal studies and business ethics.
“As a private company, at least under current First Amendment law, Facebook probably can do almost anything that it wants” in terms of regulating speech on its website, she said.
Free speech and tech experts said Facebook was likely sifting through enormous amounts of data quickly to prevent a repeat of the deadly riot in Washington. In normal circumstances, moderators could carefully review accounts before taking action. And the company may be feeling pressure from advertisers and government regulators, who are already scrutinising the social media giants.
“I think Facebook made a really good decision to put a pause,” Temple’s Thatcher said. “Slowing down the speed at which the conversations occur... and slowing down the speed at which dissemination occurs, encourages what we call slow thinking, which is more rational.”
However, Facebook is not consistently and clearly communicating the logic and policies driving its technical solutions, Thatcher said. Greene, of the EFF, said Facebook should be more transparent about its general moderation practices and with users about their specific cases.
The company’s new oversight board last week found Facebook’s moderation policies were poorly communicated to the public and overturned moderators’ decisions in four of five cases it reviewed. The cases covered topics ranging from users incorrectly quoting the Nazi Joseph Goebbels to a breast cancer campaign that showed pictures of women’s breasts.
Olla regained full access to his Facebook the same day The Inquirer asked about his case, though the company said the temporary restrictions were already scheduled to end that day. Over the coming months, the company’s oversight board will consider Trump’s Facebook suspension, which remains indefinite. The public has until Feb 8 to weigh in. – The Philadelphia Inquirer/Tribune News Service