“Make sure you vote today to stop Labour ballsing up Britain” is the headline on the front of The Sun today, as Britain goes to the polls. “Send ‘em Packing”, is the Mirror’s counter - instead urging readers to vote Labour. As a glance at the papers will show, almost all of the dead tree press has endorsed a party in the election, in a bid to influence the way you vote.
This isn’t a new phenomenon - the press has always taken sides. One fun example: back in 1951 The Mirror splashed a picture of a pistol with the words “Whose Finger? Today YOUR finger is on the trigger”, endorsing Labour’s Clement Attlee over Winston Churchill.
It was widely believed that these endorsements mattered too - in 1992, so the legend goes, it was The Sun Wot Won It for the Tories. Here in the 21st century, though, the power of the dead tree press isn’t what it was. There’s perhaps a reason why, this time around, Ed Miliband is sitting down for interviews on YouTube with Russell Brand, and why he has never worried about winning Rupert Murdoch’s approval the way Tony Blair did.
So this raises the question: Who can influence an election now? If newspapers used to be the power brokers, could today’s social media platforms play a similar role? If the Daily Mail will happily say “Vote Tory”, why doesn’t Google change its Doodle to a playful animation showing us a vision of the dystopia that would emerge if Ed Miliband walks into Number 10? Could Mark Zuckerberg influence the outcome of the election?
The Power of the Facebook Algorithm
You could be forgiven for thinking that tech plays a neutral role in elections - but this is far from clear. And perhaps the biggest player of all - Facebook - is the best evidence of this so far.
In June last year, Facebook freaked everyone out when it revealed that the company had been quietly manipulating the news feeds of some users to see if it could alter their emotions. It did this by changing its algorithm.
Every time you go to Facebook, the particular collection of statuses, shared items and photos that you see is picked by the algorithm. Because there are so many different types of content, and potentially thousands of updates from friends, Facebook has to select what you get to see and what you don’t. The upshot is that if, as in the experiment, the algorithm were to prioritise posts with a negative sentiment - flagged by words like “sad” and “death”, or sad emojis - you might come away thinking your friends are having a miserable time. If it prioritised items with a happy sentiment, you’d probably have an entirely different reaction.
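To make the mechanics concrete, here’s a minimal, hypothetical sketch of that kind of sentiment-weighted ranking - the word lists, scores and bias knob are all invented for illustration; Facebook’s real algorithm is secret and vastly more complex:

```python
# A toy, hypothetical sketch of sentiment-weighted feed ranking.
# The word lists and weights are invented for illustration only.
import re

NEGATIVE_WORDS = {"sad", "death", "miserable", "awful"}
POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}

def sentiment_score(post: str) -> int:
    """Crude sentiment: +1 per positive word, -1 per negative word."""
    words = re.findall(r"[a-z']+", post.lower())
    return sum(w in POSITIVE_WORDS for w in words) - sum(w in NEGATIVE_WORDS for w in words)

def rank_feed(posts: list[str], mood_bias: float) -> list[str]:
    """Order posts by sentiment, steered by a hidden bias.

    mood_bias > 0 surfaces happy posts first; mood_bias < 0 surfaces sad ones.
    """
    return sorted(posts, key=lambda p: mood_bias * sentiment_score(p), reverse=True)

posts = ["Feeling sad today", "What a great, happy day", "Lunch was fine"]
print(rank_feed(posts, mood_bias=1.0))   # happy post first
print(rank_feed(posts, mood_bias=-1.0))  # sad post first
```

The point is how small the lever is: flip the sign of one hidden parameter and the same set of posts tells a completely different emotional story.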
It was an interesting case study because, whilst it was creepy as hell, it wasn’t a million miles away from what Facebook - and pretty much every other major tech company - does as a matter of routine to test new code and ideas. So-called A/B testing is where a site presents different sets of users with different content, in order to measure how it affects behaviour. At its most benign, this could be something like changing the size of the “Share Photo” button to see if it encourages people to share more photos. Sentiment analysis is also increasingly common on social media: many big brands use it to try to figure out whether Facebook and Twitter users are saying nice or horrible things about their products.
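In case A/B testing sounds exotic, it usually boils down to something this simple - a hypothetical sketch, with the experiment name and bucket labels invented:

```python
# Hypothetical sketch of deterministic A/B bucketing. Hashing the user and
# experiment names together makes the assignment stable, so the same user
# always sees the same variant on every visit.
import hashlib

def ab_bucket(user_id: str, experiment: str, buckets=("control", "variant")) -> str:
    """Hash user + experiment into a bucket - no database of assignments needed."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return buckets[int(digest, 16) % len(buckets)]

# e.g. decide which users see a bigger "Share Photo" button
print(ab_bucket("user42", "bigger-share-button"))
```

Run the site with the two buckets seeing different versions of the page, compare how each group behaves, and ship whichever version wins.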
Where this gets really interesting is when you consider elections. What if on election day Facebook was to post a message to all of its users in the country where voting is to take place, reminding them to vote? Could Facebook encourage greater voter turnout?
Amazingly, this isn’t a hypothetical. Facebook has already done it. Back in 2012 the company ran an experiment (with a mere 61 million users) in America, prompting users to vote. It worked in a few different ways: some users got no prompt to vote, some got a simple reminder at the top of their feed, and some got a reminder with an “I voted” button, which would show which of their friends had also voted, alongside their profile pictures.
The results are impressive: whilst the people who got the simple reminder voted at the same rate as those with no reminder, those who got the “social” notification were two per cent more likely to click “I voted”. This may not sound like much, but it is thought the experiment led to 340,000 more people voting than otherwise would have - and two per cent can definitely matter in an election. It was essentially digital peer pressure that did it: if you can see all your friends are voting, it might encourage you to get off your arse and down to the polling station.
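As a back-of-envelope check on how a small lift scales (the audience figure below is an assumption picked purely for illustration, not Facebook’s actual number):

```python
# Back-of-envelope: a tiny per-user lift becomes a large absolute number.
# The audience size here is an illustrative assumption.
audience = 17_000_000  # assumed users reached by the "social" prompt
lift = 0.02            # two per cent more likely to vote

extra_voters = round(audience * lift)
print(extra_voters)  # 340000
```

In a first-past-the-post system where individual seats turn on a few hundred ballots, numbers on that scale are anything but a rounding error.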
How Facebook Could Influence the 2015 General Election
If you’re in the UK and you went to Facebook this morning, this might sound familiar - as Facebook has done exactly the same thing today. In fact, it is even tracking the number of votes across the day.
This might sound relatively benign. I mean, who doesn’t want to see more people voting? Even Russell Brand changed his mind on that. But there are some profound implications for what it could mean for the power of the big social networks, and indeed for our democracy.
On a practical level, it could change the electoral battlefield. For example, one of the ‘laws’ of politics is that old people are more reliable voters than the young - which is why politicians so heavily court the grey vote by promising to protect pensions, and why they feel more able to screw over the young. If Facebook (and other social networks) can beat this trend by getting more young people to the ballot box, that could change how political parties position themselves. The immediate implication is that Labour and the Greens could gain an electoral advantage over the Conservatives and UKIP. But in time it could see all of the parties doing more to reach out to the young.
There’s a potentially sinister side too. The Facebook algorithm is basically a secret: nobody outside of the company knows exactly how it chooses what stories to feature. So imagine if Mark Zuckerberg wanted to be a supervillain, or wanted to back one specific party over another in the election. For example, imagine an outlandish scenario in which one party promised massive corporate tax breaks to companies with blue logos. From a corporate point of view, there could conceivably come a point when taking a partisan position is in the business interest of the company.
Given how much Facebook knows about us it would be relatively straightforward for Facebook to, say, only flag up the election to people that it thinks will vote the way it wants them to.
Even more subtly, rather than just on election day itself, it could conceivably tweak the algorithm over time to slightly prioritise posts that say nice things about the person it wants to get elected, or de-prioritise articles saying the opposite. And because of the algorithm’s invisibility, we wouldn’t even notice - it’d just be a sustained small psychological shift over time.
Again, there’s already evidence of Facebook doing something similar with news stories - albeit not in a particularly partisan way. To the chagrin of many publishers, in February last year Facebook tweaked its algorithms to de-prioritise Upworthy-style “click-bait” articles. Whilst that was annoying news for Upworthy, and for users who like articles where they’ll never believe what happens next, the big takeaway is that the technology to be selective about the articles that appear in your Facebook feed is already there.
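A toy illustration of that kind of selectivity - the phrase list is invented, and real systems lean on engagement signals (people clicking but not reading or sharing) rather than simple keyword matching:

```python
# Hypothetical sketch of headline-based click-bait demotion.
# The phrase list is invented; real detection is far more sophisticated.
CLICKBAIT_PHRASES = ("you'll never believe", "what happens next", "will shock you")

def is_clickbait(headline: str) -> bool:
    h = headline.lower()
    return any(phrase in h for phrase in CLICKBAIT_PHRASES)

def demote_clickbait(articles: list[str]) -> list[str]:
    """Stable sort: click-bait headlines sink to the bottom of the feed."""
    return sorted(articles, key=is_clickbait)

feed = ["You'll never believe this one trick", "Polls open across the UK"]
print(demote_clickbait(feed))  # election story first
```

Swap “click-bait” for any other category of article and the same handful of lines becomes an editorial policy.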
Whilst there is no evidence that Facebook has done anything dodgy - nor any suggestion that it wants to - it certainly would have the power to do something like this if it chose to. Ultimately, the big tech players are not neutral arbiters of content - and they all have potentially huge amounts of unchecked power at their disposal - which could, conceivably, change the results of an election. We just have to hope that Mark Zuckerberg doesn’t spend his billions on a volcano lair and turn evil.
Despite today’s newspaper hubris, it isn’t The Sun wot will win it - but it could be Facebook.
Lead Image Credit: I'm Not Voting Labour or Conservative