Facebook is enacting a number of changes to its trending news module following a two-week internal investigation. The company’s announcement comes in response to a letter of inquiry from the US Senate Commerce Committee, issued one day after Gizmodo reported the allegations of one former “news curator” for the trending section, who said coworkers regularly suppressed topics of interest to conservative readers.
Facebook will no longer rely on external news websites or RSS feeds to “identify, validate, or assess the importance of trending topics”, according to a statement from the company. Former news curators who spoke to Gizmodo on the condition of anonymity said these websites and RSS feeds were sometimes used to insert topics into the section that were not organically trending on the site. Facebook later stated that a select group of 10 publications, including the New York Times, the Wall Street Journal, and Buzzfeed, was used to determine whether a story was important enough to be included in the trending section.
Facebook is also renaming some of the tools its curators use to moderate the trending news section, in order to “better reflect the real nature of the action[s].” Most notably, the “blacklisting” tool — used to block naturally trending topics from inclusion in the trending section — will be renamed “revisit”. The “injection” tool, used to insert trending topics or combine several topics into one, is also being reframed as a “topic correction” tool.
The company’s 12-page report also details the results of its internal investigation, which sought to determine whether any bias had affected its trending news section. Facebook states that the investigation found no evidence of “systematic bias”, and that conservative and liberal topics were approved for the trending section at roughly equal rates. As Gizmodo originally reported, several former news curators said they had never been instructed to systematically suppress conservative news, but one former curator kept a running list of topics the curator felt were inappropriately blacklisted or disregarded by colleagues.
Facebook said in its report that prior to July 2015, topics could have been kept out of the trending module if they weren’t covered by major news organisations:
The investigation did reveal that—prior to July 2015—reviewers followed guidance that did not permit the acceptance of a topic if one of the first 12 posts (the “feed”) associated with that topic did not include a post from a news organization, a primary source, or a verified profile or page. This guidance may have in some instances prior to that date prevented or delayed acceptance of topics that were not covered by major news organizations.
The timeframe Facebook investigated is left vague in the company’s report. The trending news section launched in January 2014, but it appears Facebook was only able to access data dating back to December of that year. “We could not reconstruct reliable data logs from before December 2014, so were unable to examine each of the reviewer decisions from that period,” the report says.
The report says “rates of ‘boosting’, ‘blacklisting’, and accepting topics have been virtually identical for liberal and conservative topics”, but notes that the analysis only spanned the last 90 days. The former curators Gizmodo interviewed worked for Facebook from mid-2014 to December 2015.
“Despite the findings of our investigation, it is impossible to fully exclude the possibility that, over the years of the feature’s existence, a specific reviewer took isolated actions with an improper motive,” the report says.
Senator John Thune, chairman of the US Senate Committee that requested information from Facebook, said in a statement that he appreciated Facebook’s efforts to seriously address the allegations. “Facebook’s description of the methodology it uses for determining the trending content it highlights for users is far different from and more detailed than what it offered prior to our questions,” Thune said.
“We now know the system relied on human judgement, and not just an automated process, more than previously acknowledged. Facebook has recognized the limitations of efforts to keep information systems fully free from potential bias, which lends credibility to its findings.”
Facebook’s General Counsel Colin Stretch said: “This process has helped us to identify valuable improvements to our service. These improvements and safeguards are designed not only to ensure that Facebook remains a platform that is open and welcoming to all groups and individuals, but also to restore any loss of trust in the Trending Topics feature.” [U.S. Senate Commerce Committee]