Facebook's Two-Way Mirror

How the social network's transparency push may be fueling recent conspiracy theories.

Last August, Facebook announced News Feed FYI, a transparency effort in the form of a series of blog posts meant to explain changes to the social network's News Feed. Here's how engineer Lars Backstrom explains the initiative:

We are continually working to improve News Feed and from time to time we make updates to the algorithm that determines which stories appear first. We've heard from our users and Page owners that we need to do a better job of communicating these updates. Starting today, we're going to try and change that. News Feed FYI blog posts, beginning with this one, will highlight major updates to News Feed and explain the thinking behind them.

The blog was created in part as a blanket measure to preempt individual engineer responses to specific articles or theories about News Feed and, more importantly, to assuage the fears of users, brands, and publishers regarding its mysterious and powerful ranking and sorting algorithms.

The posts are billed as a "window into News Feed," but the view is tightly controlled. Since August, there have been five News Feed FYI posts, all structured to communicate a very specific company message. Each essentially begins the same way:

"The goal of every update to News Feed is to show people the most interesting stories at the top of their feed and display them in the best way possible." —8/6/13

"The goal of every update to News Feed is to show people the most interesting stories at the top of their feed and display them in the best way possible." —8/23/13

"The goal of News Feed is to deliver the right content to the right people at the right time. Our goal with the ads we show in News Feed is no different." —9/27/13

"The goal of News Feed is to show the right content to the right people at the right time whether it's from a close friend or a news source halfway across the world. In the last year, more people found news on Facebook than ever before." —12/2/13

"The goal of every update to News Feed is to show people the most interesting stories at the top of their feed and display them in the best way possible." —1/21/14

The message isn't exactly subtle. Facebook News Feed is looking to find "interesting stories" and to display them in the "best way possible" and at the "right time."

This week, Slate's Will Oremus argued that the internet (journalists especially) has a "collective willingness to believe just about anything about Facebook so long as it looks salacious." And while there is some truth to this, it goes a little easy on Facebook. The company's unsatisfying "transparency push" invites not just conspiracy theories but reasoned critiques; the posts are vague to the point that they raise new questions without answering the old ones.

Mostly, the posts alternate between advising publishers to create and promote "high quality content" and assuring users that ads on the site will be more relevant and of higher quality than ever before. But the transparency pretty much stops there. In a December update, News Feed engineers noted that the site would begin "doing a better job of distinguishing between a high quality article on a website versus a meme photo hosted somewhere other than Facebook" but never made clear where the line between "high quality articles" and "meme photos" would be drawn. Similarly, an August post details the survey questions engineers asked thousands of users in order to understand how they define high-quality posts on the network, but it doesn't divulge the answers, or anything about them, noting merely that the survey was used to help refine an algorithm.

Facebook, like Google, still treats its algorithms as competitive secrets. This is, if not ideal, to be expected: Google's occasional PageRank and search disclosures are treated by publishers and marketers mostly as SEO guides, and they provide little practical value to anyone else. Yet these blog posts, despite their decided lack of disclosure, inevitably prompt their own small news cycles, calling attention to News Feed and inviting new speculation about what makes it tick.

In the context of Facebook's growing influence on publishers, it's easy to see how this speculation gets intensified and amplified as publishers nervously await a "correction."

Facebook's posts seem to have done very little to quell anxieties about News Feed tweaks; they may have only exacerbated the issue. As The Atlantic's Derek Thompson noted yesterday, Facebook's average user (and probably the average brand and journalist) likely views News Feed as a mirror that reflects the user's online habits and preferences, which is precisely the image Facebook has cultivated for years.

Facebook is enormous, profitable, and powerful. But that doesn't mean it can expect to have things both ways. If it wants to be seen as truly transparent about its most important product, it must be willing to compromise. That means disclosing, to the best of its ability, the governing properties and subtle tweaks of News Feed: how it defines and separates quality content from less desirable content, and what those judgments even mean. And in cases where information can't be revealed, it means explaining why in plain language.

For now, the only significant disclosure in these posts is between the lines: that Facebook's algorithms, which users and the company itself have traditionally treated as merely reflective, contain or are influenced by some sort of human editorial sensibility; that they are, in a real way, subjective. To wonder what that subjectivity looks like, and where it comes from, isn't paranoid or irresponsible. It's the most obvious line of inquiry, and one that Facebook opened all by itself.

Facebook's mirror, in other words, has become a pane of two-way glass. What's behind it is left to our imaginations: a human, a machine, an editor, an engineer. A team of people in white coats watching and listening, their hands on the dials.