How The 2016 Election Blew Up In Facebook’s Face

As Facebook attempted to capture the fast-moving energy of the news cycle from Twitter, and shied away from policing political content, it created a system that played to confirmation bias and set the stage for fake news.

As midnight approached on November 6, 2012, a newly re-elected Barack Obama tweeted a photo of himself embracing the first lady, with the message “four more years.” The president’s election night tweet went unprecedentedly viral, racking up more than 500,000 retweets within a few hours and capping the most active day in Twitter history. Meanwhile, on Facebook, a network four times Twitter's size, the same photo had fewer than 100,000 shares — evidence the energy around current events was elsewhere. It was a triumph for Twitter, a rare occasion when the little bird out-sang its big blue brother. At Facebook's headquarters in Menlo Park, it was an alarm bell ringing loudly in the middle of the night, a call to action.

Four years later, Facebook couldn’t be more relevant. The platform played a defining role in the 2016 election — but perhaps not for the reason it hoped. In the days after the vote, it’s come under fire for creating an infrastructure that played to confirmation bias and allowed political-meme-makers, sensationalists, and fake news purveyors to thrive — and perhaps even alter the election’s outcome. The company’s influence was so apparent that when CEO Mark Zuckerberg denied that the fake news coursing through its system influenced the election, his own employees disputed him.

And while it was likely never the company’s intent to create a system that encouraged people to hear only what they wanted — whether or not it was true — Facebook didn’t get here by accident. It made a huge push over the last four years to be a destination for news, indeed, to be your "perfect personalized newspaper." Since that Obama tweet, the company retooled its platform, creating a system designed to make it easier to share and promote timely and trending stories and to help them spread rapidly across its network. In the process, Facebook, with its 1.79 billion monthly active users, grew to more than five times the size of Twitter.

The promise of tapping into this lightning enticed massive news organizations to go all in on Facebook. But those enhanced sharing and interaction mechanisms, coupled with changing platform dynamics, created a system that catered to reinforcing existing world views. Facebook’s News Feed algorithm prioritizes sharing and time spent reading articles, but in a scroll-through world these measurements are prone to reward material that doesn't challenge you. This in turn enabled a host of loose-with-the-truth upstarts to go mega-viral, at times even more successfully than mainstream news organizations.

Essentially, Facebook built a petri dish for confirmation bias, developing ideal conditions to grow news sources whose mission was to provide people with fodder that backed up their own beliefs. Here’s how it got there.

It Starts With The Share

Facebook’s transformation began almost immediately after the 2012 election. On November 14, 2012, a full eight days after the vote, a TechCrunch headline proclaimed: “Facebook Finally Launches ‘Share’ Button For The Mobile Feed, Its Version Of ‘Retweet.’”

The move, seemingly minor at the time, set the table for a behavior shift on Facebook, encouraging people to share quickly and without much thought. That in turn helped all forms of content boom across the network. As the TechCrunch article astutely noted: “When people do use the Share button on the web, they often give their own description of a link. But on mobile where typing is more of a pain, a Share button could encourage people to rapidly re-share link after link.”

The mobile share button would help links surpass text and photos as the fastest growing form of content shared on Facebook. This would prove crucial in an unexpected but very important way not long down the road.

Fake It ‘Til You Make It

It’s no coincidence that Jestin Coler started National Report, his wildly successful fake news site, only a few months after Facebook added the mobile share button. The California-based satirist watched in a bit of amazement as articles from fringe conservative news sites began booming across Facebook, and decided he wanted in on the action. “I was seeing those sorts of sites all over the place with large followings and they were getting good traffic and I just thought to myself, Well I could do that,” Coler told BuzzFeed News. And so he debuted National Report in February 2013.

Coler could have reported the news, or simply blogged. But he noticed that fringe political pages would pick up just about anything that helped them make their point, including fabricated news. So National Report began publishing fake news about gun control, abortion, and President Obama, which Coler suspected would set off the right. It sure did. Those pages quickly began aggregating his stories. “We really went for the confirmation bias thing,” Coler said. “What we assumed people wanted to hear, that was really what we were selling.”

National Report had impact. Real impact. Shortly after Coler fabricated a story about food stamps being used in Colorado to buy weed, the Colorado legislature introduced a real bill to ban the practice.

“That one was a fun one for me,” Coler said. “It’s not that hard to get people to share that stuff and to get the right-wing folks all riled up. It’s kind of scary how easy it is, really.”

Getting Trendy

The new share button wasn’t the only feature Facebook added to get timely content spreading across its platform. In March 2013 it introduced hashtags, which, as AllThingsD pointed out, were borrowed straight from Twitter: “Imitation is indeed the sincerest form of flattery. And Twitter should be very flattered right now.”

Five months later, on August 6, 2013, Facebook held a press event that, according to TechCrunch, repeatedly “emphasized real-time content” in its News Feed. The next day, Facebook added another feature: Trending. Facebook emphasized that Trending, a module that highlights some of the most talked about content on Facebook, was a small test, and promised to “share more details down the line if we decide to roll it out more widely.”

Adding a slew of Twitter-like features gave Facebook a new capability to rapidly disseminate of-the-moment content. Facebook still wasn’t as fast as Twitter’s real-time updates, but it didn’t have to be. It was fast enough, and its 1 billion monthly active users, compared to Twitter’s 250 million or so, made up in mass what it lacked in energy.

“These features gave people visibility into what people were talking about in the moment,” a former Facebook product manager told BuzzFeed News. “Now anyone, not just public figures, could add their voice to bigger conversations across the internet."

Those changes set up Jestin Coler’s National Report for a big year. “2014 ended up being huge for us,” he said. National Report received around 30 million page views that year, he said, almost all thanks to Facebook.

By November 2014, Mark Zuckerberg was feeling pretty confident in Facebook’s capacity to deliver the latest news, enough so that he likened Facebook’s aim, incredibly, to that of a newspaper. “Our goal is to build the perfect personalized newspaper for every person in the world,” he said. “We’re trying to personalize it and show you the stuff that’s going to be most interesting to you.”

To build this “perfect personalized newspaper,” Facebook had to make the News Feed as interesting and relevant to people as possible. And to do that, it engaged in a number of quality-improving measures, including surveying its users on what they found valuable, and optimizing for time spent reading stories after a click from the Facebook News Feed, in addition to measuring traditional metrics like the number of shares and likes.

These moves were well-intentioned but fundamentally flawed. You’re unlikely to spend much time reading and interacting with material you disagree with. As another former Facebook employee told BuzzFeed News, “Even though they could show you stories that you disagree with, they’ll probably not because chances are you’ll spend more time on Facebook if you are seeing stuff you agree with and that you like.”

Fake news sites just out for a profit, and fringe websites trafficking in propaganda, both benefitted enormously. And a Facebook now outfitted with the mechanisms to make stories rapidly propagate turbocharged their rise.

The First Steps Against Fake News

As early as August 2014 Facebook knew it had a growing problem on its hands, but it handled the issue with kid gloves, pledging to mark links from hoax websites with a “satire” label. The Washington Post, in a story calling out the “terrible hoax-news industry,” explained why the label was needed. “A top post on Empire News will frequently boast more than a quarter of a million Facebook shares, far more than on any other social platform,” the Post reported. “As that information spreads and mutates, it gradually takes on the pall of truth.”

In January 2015, Facebook got more aggressive. It wrote a News Feed FYI blog post with the title “Showing Fewer Hoaxes.” People were complaining about fake news and hoaxes, the blog post said, so Facebook would diminish these posts’ reach. “A post with a link to an article that many people have reported as a hoax or chosen to delete will get reduced distribution in News Feed,” the post said. And with that, Facebook began a public fight against fake news.

But the filter bubble kept growing, and it was about to get worse thanks to a change in the way people were behaving on the site.

Trumped Up

On June 16, 2015, Donald Trump hit the ground in Iowa and began campaigning for president, taking shots at his fellow Republican contenders and telling the news media that only he could beat Democratic frontrunner Hillary Clinton. A Bloomberg story from that day said “Democrats joyfully welcomed Trump's entry into the race.”

Clearly, these Democrats hadn’t yet taken notice of his Facebook page. Two days later, a Trump post about immigration would receive more than 190,000 shares on Facebook. That was almost twice as many shares as Obama’s election night post had three years earlier.

With the presidential primaries underway, another fundamental change was taking place inside Facebook, one responsible for giving even more prominence to posts from news organizations and public figures, though the company had not yet started reckoning with it.

So-called “original sharing,” where people post their own photos, text updates, etc., instead of simply pressing “share,” was declining. The extent of Facebook’s original sharing problem came to light in an April 2016 article in The Information, which reported that original sharing was down by 21% in mid-2015 compared to the previous year. With original sharing down, content from celebrities, political candidates, and news sites began to fill that void. Facebook’s algorithm was already turning the platform into a playland for confirmation bias content, and the original sharing decline gave it yet another boost.

Facebook continues to push back on this notion, and pointed BuzzFeed News toward comments Mark Zuckerberg made in an earnings call in April. "Overall sharing is up across Facebook," he said.

Facebook also maintains that the mix of what people see in News Feed has remained unchanged. "In general the percentage of News Feed that has been links and therefore also news distributed through links has been very consistent over time," Facebook Product Management VP Adam Mosseri told BuzzFeed News in an interview. "The vast vast majority of stories in people's News Feed are not news."

And meanwhile, Donald Trump, the iconic original sharer, was building steam.

The Appearance Of Bias

Now that it had new tools in place to identify and promote trending stories, Facebook took another step that it largely kept secret: It began making editorial decisions about the content that would appear in Trending. The company had hired a team of humans to curate its Trending column, but in August 2015, it told Recode that its algorithms alone were responsible for deciding what ended up there. “These people don’t get to pick what Facebook adds to the trending section,” Recode reported. “That’s done automatically by the algorithm. They just get to pick the headline.”

Not so. In May 2016, Gizmodo published an explosive story reporting that these human curators “routinely suppressed conservative news.” The article, quoting the curators themselves, found that there was indeed human judgment involved in what appeared, and didn’t appear, in the Trending column. A conservative member of the curation team told Gizmodo that right-leaning Trending topics were regularly omitted. “I’d come on shift and I’d discover that CPAC or Mitt Romney or Glenn Beck or popular conservative topics wouldn’t be trending because either the curator didn’t recognize the news topic or it was like they had a bias against Ted Cruz,” the curator said.

The revelations created a major scandal, one so damaging that Zuckerberg went as far as to admit “I know many conservatives don't trust that our platform surfaces content without a political bias” in a post to his Facebook page — even though Facebook claimed it found no evidence of wrongdoing.

The day of Zuckerberg’s admission, the CEO hosted a meeting of conservative leaders inside Facebook to address the controversy. Conservative publisher Brent Bozell, who attended the meeting, told BuzzFeed News he walked away believing Facebook was sincere in its efforts to mend fences. “Everyone in that room wants the trust to be restored,” he said. “Trust is everything in this business.”

The election was less than six months away.

Fake News, Real Resilience

By the start of 2016, Facebook had been doing battle with fake news for over a year, and it wasn’t winning. A BuzzFeed News analysis of thousands of Facebook posts published across nine top fake news sites found that the average engagement (likes + comments + shares) per post for these sites’ Facebook pages was higher in February 2016 than it was in January 2015. In its analysis, BuzzFeed News used data from CrowdTangle, a social media analytics company Facebook thought so highly of that it acquired it right after the election.

A story from Jestin Coler’s National Report even made its way into a December 2015 New York Times gun control editorial and sat there for months. When BuzzFeed News contacted the Times about the story, it removed the link and issued a correction.

Though BuzzFeed News looked at nine fake news sites in its analysis, there was a 10th, American News, that didn’t make it in. American News, liked by more than 5 million people on Facebook, was bigger than the rest of the sites in the analysis by a wide margin. American News is not a fake news site, per se, but it often takes a nugget of truth and writes a dramatically exaggerated story around it with a conservative slant. After Bernie Sanders wrote an op-ed in the New York Times reacting to Trump’s victory, saying he was willing to work with the president-elect under some circumstances, for example, American News published a story with the headline: “Sanders Completely Turns On Hillary, Now Backing Trump.” (Sanders, in no part of his op-ed, said he supported the president-elect.)

American News is only one of a number of publications of its kind on the internet. You can think of them as mermaids. Seen from the surface, all appears to be normal. But go deeper, and things start to look fishy. These sites are masters at heavily slanting news to play on confirmation bias, and some are very, very popular.

Breitbart, a populist conservative website with a massive readership and 2.8 million Facebook fans, can sometimes stray into mermaid territory. In August, for instance, it used a picture of the Cleveland Cavaliers championship parade in a story about a massive Jacksonville Trump rally.

These mermaids made Facebook’s self-assigned job of policing fake news extremely difficult. Zuckerberg, in a Facebook post last Saturday, did his best to explain this minefield. “Identifying the ‘truth’ is complicated,” he said.

The job was made even harder by the fact that Facebook had already come up against accusations of bias against conservative news sources.

As the election approached, Facebook thus developed an “internal culture of fear” when it came to fake news, according to Gizmodo (Facebook disputed elements of Gizmodo's report). If Facebook were to take action to limit the spread of fake news, it would do so under intense scrutiny, since conservatives already distrusted it for what they saw as that very same action: suppressing news slanted to their viewpoint. In a world where a spectrum of truth exists, Facebook couldn’t find the place to draw the line.

If you want to know what Facebook would have faced had it taken dramatic action against fake news, just ask Bozell, the conservative publisher who joined the meeting with Zuckerberg. “Would there be a backlash if they tried something? Of course there would,” he said. “There would also be a backlash from the left if they tried it against the left.”

Asked what would happen if Facebook deleted fabricated stories about the pope endorsing Donald Trump, for instance, Bozell said the move would unleash a world of trouble. “There would be a vicious reaction from those who would point to [other] people on Facebook saying that Trump was a member of the Klan or that Steve Bannon is a white supremacist,” he said. “If Facebook were to make the decision that it was going to look at the conservative movement and delete what it called ‘fake news,’ that would be a big big big big big mistake.”

The Triumph Of The Filter Bubble

In this environment, both your right-wing uncle and left-wing aunt got drunk on fake news (and spin and propaganda), sharing shot after shot of a content format designed to appeal to partisans on both sides of the political spectrum, one that was now moving across the platform with relative impunity.

In an analysis of more than 1,000 posts on Facebook pages across the political spectrum, BuzzFeed News found that some of the least accurate pages registered some of the highest engagement. And as the election neared, some of the top "news" stories from hoax sites generated more engagement than some of the top news stories from non-hoax sites.

Perhaps more importantly long-term, agenda-driven sites like Breitbart saw their influence skyrocket while this was taking place. Breitbart Chairman Steve Bannon joined the Trump campaign as CEO in August 2016. And on November 8, Bannon not only watched his boss win the presidency, he witnessed his in-the-tank-for-Trump website beat out CNN, Fox News, and the New York Times on Election Day for Facebook interactions. Bannon last week was picked as President-elect Trump’s chief strategist. Breitbart, a publication many once considered fringe, is now the definition of mainstream.

On the day following the election, Mike Masnick, the editor of Techdirt, tried to make sense of Facebook’s role in the outcome, writing an article bluntly headlined "If You're Blaming Facebook For The Election Results, You're An Idiot." “People are believing those stories because they match with their real world experience of seeing how the system has worked (or not worked) for too long,” Masnick wrote.

Indeed. Fake news and sensationalist news would be relatively ineffective without the existing worldview they confirm. But with that backstory in place — a distrust of "the system" held by millions of Americans — Facebook provided the accelerant for this stuff to spread and take hold across the country during a deeply contentious election.

Coler, the fake news site proprietor, said he had no regrets about his role in helping develop the fake news genre. He maintained that stance when pressed about how his initial goal of seeding fabrications into conservative publications turned out to be far more serious than originally thought. “I think it’s depressing,” Coler said of people’s inability, or unwillingness, to distinguish real from fake. “At the end of the day, the confirmation bias thing is the keyword. You can sell somebody anything, and Trump certainly was a master at that. Confirming what people want to believe.”

CORRECTION

A previous version of this story quoted Jestin Coler under the name he provided, Allen Montgomery, which was fake.

