Facebook wants to bring the world closer together. Twitter vows to protect freedom of expression. And as Google tries to make information universally accessible, the company urges its employees not to be evil.
Now, these companies are facing a wake-up call after the US presidential election highlighted how social media, ads and search results can distort facts, promote fake news or reinforce biases.
US lawmakers are putting more pressure on the world's largest tech firms, drafting legislation that would require them – like TV stations – to disclose more information about political ads they run online, including who purchased the ads and the targeted audience.
Accused of undermining the democratic process, the tech firms are acknowledging they could have done more to prevent foreign powers and users from buying politically divisive ads and spreading misinformation. Mistrust of their platforms is growing.
"It's not just social media, but the whole Internet platform business has shifted in the last six months from 'good until proven otherwise' to 'bad until proven otherwise.' It's stunning," said Steven Weber, a professor at UC Berkeley's School of Information and Department of Political Science.
To some extent, the damage is done: Users are questioning whether they can trust the information on social networks and search results. Lawmakers are compelling Bay Area technology companies to testify before Congress – and then in some cases, criticising them for not taking Russia's possible interference in the US presidential election more seriously.
Facebook this week turned over more than 3,000 ads – likely linked to Russia – to congressional investigators looking into whether the country meddled in the US presidential election. The ads, which focused on divisive political issues such as immigration and gun rights, were seen by an estimated 10 million people in the United States before and after the election, the company said Monday.
Facebook said it found 470 accounts and pages linked to Russian entities that had placed the ads, and it shut them down because their identities weren't authentic.
Twitter recently met with lawmakers, revealing that it found 201 Twitter accounts linked to the Russian entities that purchased ads on Facebook.
And Google is also reportedly investigating whether Russian entities used its ads and services.
Some tech moguls even expressed disappointment in themselves.
"After the election, I made a comment that I thought the idea that misinformation on Facebook changed the outcome of the election was a crazy idea," Facebook CEO Mark Zuckerberg wrote in a recent post on social media. "Calling that crazy was dismissive and I regret it. This is too important an issue to be dismissive."
But as Facebook digs deeper into the negative roles it may have played in the 2016 election, it's also facing political heat from conservatives who question the tech firm's motives.
They've accused Silicon Valley tech firms of censoring conservative content and are suspicious of CEOs who have slammed some of the Trump administration's policies.
Zuckerberg created and backs an immigration lobbying group, Fwd.us, which criticised Trump's decision to end an Obama-era policy that shielded young, undocumented immigrants from deportation. And speculation persists that Zuckerberg could run for president in 2020, even though the tech billionaire has denied any intention of doing so.
"We live in a time when nothing is apolitical," Weber said. "Particularly for large firms to try to strike a position that they're somehow above politics or completely impartial is disingenuous, but more importantly, no one believes it."
After Facebook turned over ads to congressional investigators, Trump took aim at the company on Twitter and accused the Menlo Park-based tech firm – like other media outlets – of being consistently "anti-Trump."
But some experts don't think Trump's attacks on Facebook will drive users away from the world's largest social network.
"I suspect that when Donald Trump strikes at Facebook by questioning its integrity, the concern might be that people who support Donald Trump might actually walk away from Facebook," said Bill Whalen, a research fellow at Stanford University's Hoover Institution. "I don't think people are going to walk away from social media. That's a stronger addiction than Donald Trump."
Many of Facebook's more than 2 billion users worldwide rely upon the social network not only to consume news, but to promote their work or keep in touch with families and friends.
About two-thirds of Americans get at least some of their news on social media, including Facebook, Twitter, YouTube and Snapchat, according to a study released in September by the Pew Research Center.
Some Facebook users said they are already wary about news they see on social media, noting that students should be taught in school to check the source of the information they're consuming.
Lauren Wilson, a 21-year-old Los Altos Republican, said the news she sees on Facebook is more of a "conversation starter," making her more interested in searching for information on different topics.
"If anyone is going on Facebook to find the full truth and nothing but the truth, they're lying to themselves," she said.
Trump's criticisms did hit a nerve with Zuckerberg, who has been touring the United States and runs a company whose mission is to "bring the world closer together."
In a Facebook post, Zuckerberg pushed back against the allegations that Facebook wants to censor conservative content. He noted that liberals also criticised the company, accusing it of helping Trump by not doing enough to stop the spread of fake news.
"Both sides are upset about ideas and content they don't like. That's what running a platform for all ideas looks like," Zuckerberg wrote.
Both Facebook and Twitter said they're making changes to protect election integrity and stop the spread of misinformation.
Facebook said that in the coming months, users will be able to visit advertisers' pages and see the ads they're currently running on the social network. The tech firm is also hiring more than 1,000 people to review ads and investing more in machine learning to flag and take down ads that run afoul of its online rules.
Twitter said it will beef up its enforcement of suspicious accounts, engagement and tweets on its website.
Despite the social networks' pledges of transparency, some experts suggested that it's government oversight and regulation that will ensure tech firms follow through.
US Senator Mark Warner raised concerns that the information Twitter recently disclosed in testimony to lawmakers was "inadequate" and questioned whether the company was taking the concerns about election integrity seriously.
Warner, D-Virginia, has been working on legislation that would require digital platforms like Facebook and Google to keep a public file of certain election ads and communications. The file would include a copy of the ad, the amount it was purchased for, the number of views generated and other information.
The firms would also need to make a "reasonable" effort to ensure that election ads are not purchased directly or indirectly by a foreign national, according to a letter by Warner and Sen. Amy Klobuchar, D-Minnesota.
"No one is untouchable when the political winds really shift," Weber said. "I think in some ways the rhetoric that the firms have created around their social good mission makes them more vulnerable." — San Jose Mercury News/Tribune News Service