Facebook Tried to Break its Users by Deliberately Breaking its Android App

By Gerald Lynch

Facebook is coming under fire for once again carrying out questionable experiments on its users. A report from tech journal The Information accuses the social network of deliberately disabling its Android app for selected users in order to monitor at what point they'd stop coming back.

Undoubtedly to Facebook's pleasure, the experiment showed that the network's users are well and truly addicted. Despite the downtime lasting considerable stretches for those who were part of the study, users repeatedly attempted to access the network, either by re-opening the app or eventually turning to the mobile version of the site. "People never stopped coming back," claimed a source familiar with the testing.

So why would Facebook wish to bork its own app? According to The Information, it was an attempt to prepare for a worst-case, DEFCON 1-style offensive from rival Google. Should Google ever see fit to pull the Facebook app from the Google Play store, Zuckerberg's team wanted to analyse whether users would look for other means of access.

But that's not a good enough excuse, given how Facebook positions itself as an integral part of daily life. As well as being the default way to connect with friends and a host to billions of photos, Facebook wants to be your first port of call in times of emergency and disaster, offering the option to donate to charity and influencing the prosperity of businesses through its brand pages and advertising. Facebook wanted to become a juggernaut without a meaningful alternative, woven into our lives, and it's achieved that. As such, it's now arguable that it has a moral duty to keep its access stable and transparent.

Though the newly revealed tests reportedly took place only once, "several years ago", it's not the first time Facebook has secretly messed with its users. In 2014, Facebook was criticised for manipulating users' news feeds, showing some a feed skewed towards negative posts and others one with a more positive bias. The aim was to see whether Facebook could have an effect on the tone of a subject's own output, tracking whether or not their posts mirrored the positive or negative content they were exposed to. [The Information via Guardian]
