The recent disclosures that Russians bought ads from Facebook, Google and Twitter to target US voters in 2016 have left lawmakers investigating how to prevent foreign interference in future elections.
But there’s another alarming problem that Congress also needs to address: how to prevent domestic and foreign organisations from duping Americans out of information they unwittingly share on social media – and using the data to try to sway elections.
During the 2016 election, the Trump campaign hired Cambridge Analytica, an American affiliate of a British consulting firm, which built psychological profiles of over 200 million Americans in part by using information they shared on social media. For example, according to The New York Times, hundreds of thousands of Americans took personality quizzes spread by the firm on Facebook, which were designed to reveal how they scored on measures of the so-called “big five” personality traits. I’m willing to bet many of those test takers didn’t know their answers could be used by political consultants to profile and target them – and that they wouldn’t have taken the quizzes if they had.
Cambridge Analytica now says it didn’t use psychographic data when it worked for Trump (though reports it tried to get stolen data from WikiLeaks certainly call the firm’s ethical standards into question). But it – or anyone else – could try to do so in a future election. That’s why Congress needs to act.
Of course, it should remain legal for an organisation to post a question on social media and then list or use the aggregate results (for example, 45% of Americans believe x). But if organisations are going to extract personally identifiable information – PII, in industry parlance – from things they post on social media, they should be required to disclose three things: who they are, how they’re funded, and how they’re going to use the data. That way, Americans can make informed decisions about whether they want to share their personal information.
There’s some precedent for this. Gary Nordlinger, a professional in residence at The George Washington University Graduate School of Political Management, noted that, according to the American Association for Public Opinion Research’s code of ethics, pollsters can’t disclose information to clients or the public that could be used to identify people who participated in surveys without their permission.
Of course, advertisers routinely target people based on data we probably didn’t realise would be shared. The majority of online ads today are bought through exchanges such as DoubleClick Bid Manager (which is owned by Google) and The Trade Desk. These platforms aggregate thousands of data points on individual consumers – like information from resorts about where they vacationed and data from auto loan providers about what kinds of cars they drive. So, for example, an advertiser can pay for an ad that reaches people in a particular ZIP code who own Priuses or have been to Disney World.
But that doesn’t make it right to dupe Americans out of more information without their knowledge. We should have the right to keep knowledge about how we think anonymous if we so desire. That’s much more deeply personal than data about what we buy or where we went to school. And such information could be much more useful to a political campaign or corporation trying to manipulate us.
Since quizzes and other questions posed on the Internet are essentially just online polls, the same ethical standards should apply. Of course, compliance with the American Association for Public Opinion Research’s code of ethics is voluntary. But the potential for a foreign government to use such data to try to influence a future election is such a serious threat to American democracy that we need a law to prevent this, rather than another industry code.
Congress should act now to protect us. In the meantime, let’s all be cautious about what we share on social media. I’m guessing the folks who scored highest in conscientiousness on those Cambridge Analytica quizzes will be most careful. — Bloomberg