“Don't chuck Britain in the Cor-Bin” is the headline on the front of The Sun today, as Britain goes to the polls. “Lies, damned lies and Theresa May” is the Mirror’s counter, urging its readers instead to vote Labour. As a glance at the papers will show, almost all of the dead tree press has endorsed a party in the election, in a bid to influence the way you vote.
This isn’t a new phenomenon - the press has always taken sides. One fun example is The Mirror back in 1951 splashing with a picture of a pistol saying “Whose Finger? Today YOUR finger is on the trigger”, endorsing Labour’s Clement Attlee over Winston Churchill.
It was widely believed that these endorsements mattered too - in 1992, so the legend goes, it was The Sun Wot Won It for the Tories. Here in the 21st Century though, the power of the dead tree press isn’t what it was. There’s perhaps a reason why this time around Jeremy Corbyn is sitting down for interviews on Facebook with Unilad, and has never worried about winning Rupert Murdoch’s approval like Tony Blair used to.
This raises the question: who can influence an election now? If newspapers used to be the power brokers, could today’s social media platforms play a similar role? If the Daily Mail will happily say “Vote Tory”, why doesn’t Google change its Doodle to a playful animation showing us a vision of the dystopia that would emerge if Jeremy Corbyn walked into Number 10? Could Mark Zuckerberg influence the outcome of the election?
The Power of the Facebook Algorithm
You could be forgiven for thinking that tech plays a neutral role in elections - but this is far from clear. And perhaps the biggest player of all - Facebook - is the best evidence of this so far.
In June 2014 Facebook freaked everyone out when it revealed that it had been tweaking the news feeds of some users to see if it could manipulate their emotions. It did this by changing its algorithm.
Every time you go to Facebook, the particular collection of statuses, shared items and photos that you see is picked by the algorithm. Because there are so many different types of content, and potentially thousands of updates from friends, Facebook has to select what you get to see and what you don’t. The upshot of this is that if, as in the experiment, the algorithm were to prioritise posts with a negative sentiment - perhaps words like “sad” and “death”, or sad emojis - you might come away thinking your friends are having a miserable time. If it prioritised items with a happy sentiment, you’d probably have an entirely different reaction.
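To make the idea concrete, here is a toy sketch of sentiment-weighted feed ranking. Everything in it - the word lists, the scoring, the `mood_bias` knob - is illustrative, and bears no relation to Facebook's actual (secret) algorithm.

```python
# Toy sentiment-weighted feed ranker - all names and weights are
# illustrative, not Facebook's real algorithm.

NEGATIVE_WORDS = {"sad", "death", "miserable", "awful"}
POSITIVE_WORDS = {"happy", "great", "wonderful", "love"}

def sentiment_score(post: str) -> int:
    """Crude word-count sentiment: each positive word +1, each negative word -1."""
    words = post.lower().split()
    return sum(w in POSITIVE_WORDS for w in words) - sum(w in NEGATIVE_WORDS for w in words)

def rank_feed(posts, mood_bias=0.0):
    """Order posts by biased sentiment; a negative mood_bias floats gloomy posts to the top."""
    return sorted(posts, key=lambda p: mood_bias * sentiment_score(p), reverse=True)

feed = [
    "Had a great day at the beach",
    "So sad about the awful news",
    "Lunch was fine I guess",
]

# With a negative bias, the gloomy post leads the feed:
print(rank_feed(feed, mood_bias=-1.0)[0])  # prints: So sad about the awful news
```

The point of the sketch is just that a single hidden parameter, flipped from positive to negative, changes what leads your feed - without any post being removed.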
It was an interesting case study because, whilst it was creepy as hell, it wasn’t a million miles away from what Facebook, and pretty much every other major tech company, does as a matter of routine to test new code and ideas. So-called A/B testing is where a site presents different sets of users with different content, in order to measure how it affects behaviour. At its most benign, this could be something like changing the size of the “Share Photo” button to see if it encourages people to share more photos. Sentiment analysis is also increasingly common on social media. Many big brands use sentiment analysis to try to figure out if Facebook and Twitter users are saying nice or horrible things about their products.
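A minimal version of A/B bucketing can be sketched in a few lines. The function name and the hashing scheme here are assumptions for illustration - real experiment frameworks are far more sophisticated - but the core idea is the same: deterministically split users into groups and show each group a different variant.

```python
import hashlib

def assign_bucket(user_id: str, experiment: str, buckets=("A", "B")) -> str:
    """Deterministically assign a user to a bucket by hashing their ID,
    so the same user always sees the same variant of the experiment."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return buckets[int(digest, 16) % len(buckets)]

# Variant B might get the bigger "Share Photo" button:
bucket = assign_bucket("user-42", "share-button-size")
button_size = "large" if bucket == "B" else "normal"
```

Hashing on both the user ID and the experiment name means the same person can land in different groups across different experiments, while always seeing a consistent experience within each one.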
Where this gets really interesting is when you consider elections. What if on election day Facebook was to post a message to all of its users in the country where voting is to take place, reminding them to vote? Could Facebook encourage greater voter turnout?
Amazingly, this isn’t a hypothetical. Facebook has already done it. Back in 2012 the company ran an experiment (with a mere 61 million users) in America, prompting users to vote. The experiment split users three ways: some got no prompt to vote, some got a simple reminder to vote at the top of their feed, and some got the reminder plus an “I voted” button, which would show which of their friends had also voted, alongside their profile pictures.
The results are impressive: whilst the people who got the simple reminder voted at the same rate as those with no reminder, those who got the “social” notification were two per cent more likely to click “I voted”. This may not sound like much, but the experiment is thought to have led to 340,000 more people voting than otherwise would have - and two per cent can definitely matter in an election. It was essentially digital peer pressure that did it - if you can see all your friends are voting, it might encourage you to get off your arse and get down to the polling station.
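As a back-of-the-envelope illustration of why small lifts matter at Facebook's scale (this is simple arithmetic, not the study's actual methodology, which also accounted for effects spreading between friends):

```python
def extra_voters(audience: int, lift_percentage_points: float) -> int:
    """Additional voters implied by a turnout lift applied across an audience."""
    return round(audience * lift_percentage_points / 100)

# Even a lift of ~0.56 percentage points across 61 million users is already
# on the order of the 340,000 extra voters the experiment is credited with:
print(extra_voters(61_000_000, 0.56))  # prints: 341600
```

In other words, at an audience of tens of millions, effects far too small to notice at the individual level translate into six-figure swings in absolute turnout.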
How Facebook Could Influence the 2017 General Election
If you’re in the UK and you went to Facebook this morning, this might sound familiar - as Facebook has done exactly the same thing today. In fact, this time around Facebook has got more aggressively hands-on than ever, and has built a widget that will pop up and tell users the positions of the different parties on a number of topic areas.
Obviously this appears to be a benign act. We all want a better-informed electorate, and all indications are that Facebook is being scrupulously fair. In the above box, it randomises the order of both the parties and the issues, so that no one party is given an advantage. (Though presumably Facebook itself did get to choose the issue categories...)
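That kind of fairness measure is simple to implement. A sketch of per-user randomisation might look like the following - the function name, seeding scheme and party list are assumptions for illustration, not how Facebook's widget actually works:

```python
import random

# Illustrative party list - not necessarily what the widget shows.
PARTIES = ["Conservative", "Labour", "Liberal Democrat", "Green", "SNP"]

def party_order_for_user(user_id: int) -> list:
    """Shuffle the party order deterministically per user, so each user sees
    a stable but randomly chosen ordering, and no party is always listed first."""
    rng = random.Random(user_id)  # seed the shuffle with the user's ID
    order = PARTIES.copy()
    rng.shuffle(order)
    return order
```

Seeding the shuffle with the user ID means the order doesn't jump around on every page load, while still averaging out to no systematic advantage across the electorate.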
This could have profound implications for the power of the big social networks, and indeed for our democracy.
On a practical level, it could change the electoral battlefield. For example, one of the ‘laws’ of politics is that old people are more reliable voters than the young - which is why politicians so heavily court the grey vote by promising to protect pensions, and why they feel more able to screw over the young. If Facebook (and other social networks) can beat this trend by getting more young people to the ballot box, that could change how political parties position themselves. The immediate implication is that Labour could gain an electoral advantage over the Conservatives, given its wider support among younger voters. But in time it could see all of the parties doing more to reach out to the young.
There’s a potentially sinister side too. The Facebook algorithm is basically a secret: nobody outside of the company knows exactly how it chooses what stories to feature. Indeed, this is part of Facebook's competitive advantage over rivals.
Imagine if Mark Zuckerberg wanted to be a super-villain, or wanted to back one specific party over another in the election. For example, imagine an outlandish scenario in which one party promised massive corporate tax breaks to companies with blue logos. From a corporate point of view, there could conceivably come a point when taking a partisan position is in the business interest of the company.
Even more subtly, rather than acting just on election day itself, Facebook could conceivably tweak the algorithm over time to slightly prioritise posts that say nice things about the person it wants to get elected, or de-prioritise articles saying the opposite. And because of the algorithm’s invisibility, we wouldn’t even notice - it’d just be a sustained small psychological shift over time. Remember the emotions experiment? We'd start to change our opinions, but we wouldn't be able to pin it on any one post or event. Such a change to what gets displayed could be so subtle, it could be done completely under the radar.
Again, there’s already evidence of Facebook using its power to choose what news we see - though it isn't doing it for partisan reasons. After the "fake news" fiasco in the 2016 American election, the company committed to tackling the problem, and now its algorithms will flag up stories that it believes are bullshit. In other words, the technology to be selective about articles that appear on your Facebook feed is already there.
Whilst there is no evidence that Facebook has done anything dodgy - nor any suggestion that it wants to - it certainly would have the power to do something like this if it chose to. And there are few checks on Facebook and other major social media companies, if they did want to wield their power in a partisan way.
Ultimately, the big tech players are not neutral arbiters of content - and they all have potentially huge amounts of unchecked power at their disposal - which could, conceivably, change the results of an election.
Essentially, we just have to hope that Mark Zuckerberg doesn’t spend his billions on a volcano lair and turn evil. He has the power - we just have to trust him not to use it.
Despite today’s newspaper hubris, it isn’t The Sun wot will win it - but it could be Facebook.