On Friday, Gizmodo uncovered shocking new evidence that Facebook is using its platform to suppress stories about CEO Mark Zuckerberg... or maybe his janky, busted website is just bugging out again for no reason. It’s hard to say, really. That’s sort of the problem.
The issue we had with Facebook serves as a miniature lesson about transparency and our mistrust of big tech. For some reason, a story about Zuckerberg we posted to our Facebook page was hidden from many readers. The post was fully visible through web browsers in incognito mode, but an unclear percentage of users were told, “Sorry, this content is not available,” when they tried to view it while signed in.
In short, lots of people (including several Gizmodo staffers and at least one of their parents) could not see the story. By Friday afternoon, the issue seemed to resolve itself just as mysteriously.
My dad is texting me b/c he thinks Facebook is making Giz articles disappear pic.twitter.com/ToRhI96l7p
— dead cameron (@dellcam) October 4, 2019
Was it a bug, a moderation error, or something more nefarious? Personally, I find it hard to imagine Zuckerberg furiously refreshing Gizmodo’s page, just waiting to slam the giant red button on his desk labelled “WRONGTHINK.” But it’s easy to see why some people believe similar (if less cinematic) conspiracy theories.
When Facebook acts strangely – which is fairly often! – users have to draw their own conclusions about what’s happening. Like most big tech companies, Facebook doesn’t offer a phone number to call if you’re having issues. If you want a response from a social network about your specific problem, your best bet is to be a journalist, a celebrity, or someone else with the power to give headaches.
To understand their experiences with social media, then, most people are left with two choices: trust the system (lol) or develop their own, potentially very wacky, explanations.
We can see this in a common conspiracy theory that posits Facebook is secretly recording users’ conversations – to show them more relevant ads, of course. The supporting evidence is anecdotal. After verbally discussing one product or another, a user notices an advertisement for it on Facebook. In a 2016 blog post, the company denied the rumour outright, and despite subsequent revelations about overtly recorded audio, proof of the claim has never been found.
What’s happening then? Something much creepier, probably. Using both the information you share and that provided by your friends and family, the platform can infer your interests so well that it might as well be listening in. At least that’s what we think! Some attentional bias might also be at play. Truthfully, even the best-informed journalists are kind of in the dark.
Content moderation bias is perhaps an even more contentious issue, and here, at least, Facebook has something approaching an answer. What looks like targeted censorship, you see, is just a series of screw-ups. In a transcript of an internal meeting published by the Verge on Thursday, Zuckerberg blamed “most” of the platform’s moderation issues on the judgement of individual workers:
[P]eople say “oh well no, you just did this because you’re trying to censor some group of people” or “you just did this because you don’t care about protecting this group of people.” It’s really not that. [...] It’s just that there’s one thing to try to have policies that are principled. It’s another to execute this consistently with a low error rate, when you have 100 billion pieces of content through our systems every day.
Incidentally, in that same story, the Verge assured readers that Zuckerberg’s remarks were not deliberately leaked by Facebook. But how could they know? Short of mind-reading (something Facebook is working on, by the way), it’s impossible to completely ascertain the motives of a source, much less the full provenance of the information they offer.
“Knowing” things about Facebook – or any of the other enormous tech companies that now control much of our lives – is a frustrating game. They will talk your ear off with generalities, but if you want to learn, say, just how many police departments Amazon is partnering with to sell surveillance doorbells, you better do some paperwork, buddy.
Some may believe – as Zuckerberg himself seems to – that companies like Facebook are just too big to explain every little thing they do to their millions of users. Maybe so, but is it any surprise, then, that no one trusts them?
We reached out to Facebook about the weird ghost post issue and will update this story if and when we receive a reply.
Featured image: Getty