Facebook published its internal guidelines for its influential "Trending Topics" section Thursday, hours after The Guardian published similar leaked documents showing the company relies on the editorial judgment of its employees — and not just a computer algorithm — when promoting content to the section.
The 28-page document reveals the process editors at the world's largest social network use to identify potential trending topics, beginning with detection by a computer algorithm and followed by editors checking candidates against 10 major news sites.
"You should mark a topic as 'National Story' importance if it is among the 1-3 top stories of the day," the guidelines state. "We measure this by checking if it is leading at least 5 of the following 10 news websites."
Those sites include BBC News, CNN, Fox News, The Guardian, NBC News, The New York Times, USA Today, The Wall Street Journal, Washington Post, and BuzzFeed News.
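The "5 of 10" criterion described in the guidelines is, in effect, a simple threshold check. A minimal, hypothetical sketch of that rule (the function and variable names are illustrative, not Facebook's actual tooling):

```python
# Illustrative sketch of the "National Story" rule from the guidelines:
# a topic qualifies if it is leading on at least 5 of the 10 listed sites.
# Names here are assumptions for illustration, not Facebook's real code.

REVIEW_SITES = {
    "BBC News", "CNN", "Fox News", "The Guardian", "NBC News",
    "The New York Times", "USA Today", "The Wall Street Journal",
    "Washington Post", "BuzzFeed News",
}

def is_national_story(leading_on: set, threshold: int = 5) -> bool:
    """Return True if the topic is the lead story on at least
    `threshold` of the monitored sites."""
    return len(leading_on & REVIEW_SITES) >= threshold

# A topic leading on 6 of the 10 sites qualifies; one site alone does not.
print(is_national_story({"CNN", "Fox News", "NBC News", "USA Today",
                         "The Guardian", "BuzzFeed News"}))  # True
print(is_national_story({"CNN"}))  # False
```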
The guidelines also appear to contradict previous claims by a Facebook vice president that the company does "not insert stories artificially into trending topics" and does "not instruct our reviewers to do so."
But the policy states "the editorial team CAN inject a newsworthy topic that is not appearing in the review tool."
The document release comes just days after Gizmodo cited former Facebook editors who claimed they suppressed conservative news coverage from the trending module.
Facebook CEO Mark Zuckerberg responded to the allegations in a statement Thursday evening, saying "We have found no evidence that this report is true. If we find anything against our principles, you have my commitment that we will take additional steps to address it."
Zuckerberg said "in the coming weeks" he'll be inviting "leading conservatives and people from across the political spectrum" to talk with him about the issue. "I want to have a direct conversation about what Facebook stands for and how we can be sure our platform stays as open as possible," he said.
The alleged anti-conservative bias prompted a U.S. Senate committee on Tuesday to demand answers from Facebook about the social networking site’s trending topics. The Senate Committee on Commerce, Science, and Transportation sent a letter to Zuckerberg, asking if his news curators engaged in “politically motivated manipulation.”
“Facebook must answer these serious allegations and hold those responsible to account if there has been political bias in the dissemination of trending news,” Sen. John Thune, the committee’s Republican chairman, said in a statement. “Any attempt by a neutral and inclusive social media platform to censor or manipulate political discussion is an abuse of trust and inconsistent with the values of an open Internet.”
Justin Osofsky, vice president of Facebook's global operations, said the guidelines released Thursday show that the site does not "discriminate against sources of any political origin, period."
"Trending Topics uses a variety of mechanisms to help surface events and topics that are happening in the real world," he said. "In our guidelines, we rely on more than a thousand sources of news — from around the world, and of all sizes and viewpoints — to help verify and characterize world events and what people are talking about."
The Guardian reported that after the algorithm did much of the work in gathering content, a small team of editors would “inject” or “blacklist” stories from the trending module based on a set of instructions.
The guidelines show human intervention, and therefore editorial decisions, at almost every stage of Facebook's trending news operation, run by a team that at one point numbered as few as 12 people:
A team of news editors working in shifts around the clock was instructed on how to "inject" stories into the trending topics module, and how to "blacklist" topics for removal for up to a day; the reasons, including that a topic "doesn't represent a real-world event," were left to the discretion of the editors.
The company also wrote that "the editorial team CAN [sic] inject a newsworthy topic" if users create something that attracts a lot of attention, for example #BlackLivesMatter.
A spokesperson for Facebook confirmed the authenticity of the Guardian's documents to BuzzFeed News, but said they were an “older version.” Facebook then published an updated version of the internal guidelines in a blog post and a list of the sites it uses in its algorithm.
The Guardian spoke to three former employees who confirmed the process and denied that their personal biases factored into the decision making.
"The guidelines demonstrate that we have a series of checks and balances in place to help surface the most important popular stories, regardless of where they fall on the ideological spectrum," Osofsky said.