The astonishing scope of election interference on the world’s biggest social platforms came glaringly to light following the 2016 presidential election. The issue is hardly resolved, so it should not come as a surprise that Facebook has created a team—and a room—dedicated to weeding out disinformation in the run-up to the U.S. midterm elections. It’s called—very seriously—the War Room.
The New York Times published a report detailing Facebook’s so-called War Room on Wednesday. According to the report, there are over 300 people working on Facebook’s push for election security, but only 20 people will work out of the War Room, “focused on rooting out disinformation, monitoring false news and deleting fake accounts that may be trying to influence voters before coming elections in the United States, Brazil and other countries.”
The War Room itself—which, according to a photo from the New York Times report, is literally labeled as such in fat, red letters on a sign taped to the door—lives on Facebook’s main campus in a newly constructed conference room. It’ll officially open on Monday, a little less than two months out from the U.S. midterms and just a month ahead of Brazil’s presidential elections. Samidh Chakrabarti, Facebook’s elections and civic engagement team lead, characterized the War Room as a “last line of defense” when it comes to rooting out disinformation around election periods.
Facebook, of course, drew inspiration from political campaigns when developing the room. It’ll use software created by the company to track content on the platform in real time. The New York Times describes Facebook’s dashboards as follows:
“These dashboards resemble a set of line and bar graphs with statistics that provide a view into how activity on the platform is changing. They allow employees to zero in on, say, a specific false news story in wide circulation or a spike in automated accounts being created in a particular geographic area.”
According to the report, these dashboards will identify “unusual activity,” including content that could lead to violence in real life. As we’ve seen with the spread of disinformation in Myanmar—namely around hateful content targeting the Muslim population in the region—a failure to prevent the spread of this type of content has grave, real-world consequences.
Chakrabarti told the New York Times that the War Room team will also “actively remove posts” with misinformation around elections as well as acts of voter suppression. “The best outcome for us is that nothing happens in the War Room,” Chakrabarti told the New York Times. “Everything else we are doing is defenses we are putting down to stop this in the first place.”
Facebook’s known anti-disinformation efforts, as they stand, are mainly outsourced to third parties, and have appeared largely experimental and sometimes biased. A Facebook spokesperson told Gizmodo last month that the social network does not have an in-house team dedicated to fact-checking, but this War Room points to a future in which Facebook begins tackling one of its most—if not its most—menacing issues from the inside.
But until that War Room work actually gets started, all we have is some unbelievably on-the-nose PR.