Yesterday, Facebook posted a detailed explanation of its counter-terrorism program, defending itself from criticism by European leaders in the wake of recent terror attacks in Britain and France and stating there is “no place on Facebook for terrorism.” But any goodwill earned by that post seems to have lasted less than a day, as a report revealed on Friday that a “bug” affecting more than 1,000 Facebook content moderators inadvertently exposed some of their identities to suspected terrorists.
According to The Guardian, the bug meant that moderators’ personal profiles appeared as notifications in the activity logs of groups whose administrators they had removed from the site. Facebook told the newspaper that it is now testing the use of anonymous admin profiles instead of requiring moderators to use their personal accounts. That seems like the kind of thing they should have been doing already, but what do I know, I’m not the world’s largest social network.
Forty of the exposed moderators worked in the company’s counter-terrorism unit, and the company concluded that six of those workers’ profiles had likely been seen by potential terrorists. One moderator, an Iraqi-born man who moved to Ireland, quit his job and fled to Eastern Europe after his identity was revealed, returning only after “running out of money.” The moderator told The Guardian that he is still unemployed, has anxiety, and is on antidepressants. He is seeking compensation from Facebook and the contractor, Cpl, that employed him.
In its post yesterday, Facebook boasted of its capacity to review terrorist content:
Reports and reviews: Our community — that’s the people on Facebook — helps us by reporting accounts or content that may violate our policies — including the small fraction that may be related to terrorism. Our Community Operations teams around the world — which we are growing by 3,000 people over the next year — work 24 hours a day and in dozens of languages to review these reports and determine the context. This can be incredibly difficult work, and we support these reviewers with onsite counselling and resiliency training.
The Guardian reports that the Iraqi-born moderator was paid just $15 an hour for this job, despite developing a “specialist knowledge of global terror networks” and having to “scour through often highly-disturbing material.” His employment status seems to confirm speculation that the company’s “Community Operations teams” are likely contractors, not Facebook employees. Contractors generally receive fewer benefits and legal protections than full-fledged Facebook employees. While many are reportedly based in the Philippines, it’s not clear where all these contractors are located, other than “around the world.”
Facebook reportedly offered to install home alarms and provide travel escorts for the six most exposed workers, and paid for counseling above that offered by the contractor. But it’s not clear whether Facebook has offered any financial restitution to the now-unemployed moderator who fled to Eastern Europe.
The company had been criticized by European leaders, particularly in the UK and France, for failing to stamp out terrorists’ use of the site. After the recent attacks in London and Manchester, prime minister (for now) Theresa May said that terrorism had “the safe space it needs to breed” thanks to the internet, and UK home secretary Amber Rudd has specifically criticized Facebook-owned WhatsApp in the past.
Facebook has not responded to our request for comment.