A Facebook Executive Said The Platform Is Responsible For Trump’s Election

"So was Facebook responsible for Donald Trump getting elected? I think the answer is yes.”

Facebook is still reckoning with its role in the 2016 election as it heads into a contentious 2020 presidential race, according to a newly leaked executive memo and a shifting set of policies around manipulated content issued Tuesday.

The 2,500-word memo from longtime Facebook executive Andrew “Boz” Bosworth, which was first obtained by the New York Times, outlined how the company views President Donald Trump’s success in the 2016 election as the result of a highly effective digital advertising campaign, not of any untoward influence. Citing the Cambridge Analytica scandal, Bosworth described the notion that Facebook data was misappropriated to sway voters as “one of the more acute cases I can think of where the details are almost all wrong” but “the scrutiny is broadly right.”

"He got elected because he ran the single best digital ad campaign I’ve ever seen from any advertiser."

“So was Facebook responsible for Donald Trump getting elected?” Bosworth wrote on Dec. 30 in a post that was viewable only by Facebook employees. “I think the answer is yes, but not for the reasons anyone thinks. He didn’t get elected because of Russia or misinformation or Cambridge Analytica. He got elected because he ran the single best digital ad campaign I’ve ever seen from any advertiser.”

On Monday, President Trump made an appearance on Rush Limbaugh’s radio show claiming that during a dinner in October, Facebook CEO Mark Zuckerberg told him that he was “number 1” on the platform. While a Facebook spokesperson declined to comment on what was discussed at the dinner, Bosworth’s memo suggests Facebook leadership views the 2016 Trump campaign as a very well executed, if not ideal, Facebook marketing campaign.

Bosworth, one of the most outspoken and trusted of Zuckerberg’s lieutenants, is no stranger to controversy. In March 2018, BuzzFeed News obtained a post of his from 2016 in which he suggested Facebook’s mission was to connect the world, regardless of the positive or negative implications that came from that work.

“We connect people. Period. That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it,” Bosworth wrote in June 2016. “Maybe someone dies in a terrorist attack coordinated on our tools.”

Bosworth subsequently claimed he didn’t agree with what he wrote and had posted it to provoke debate within the company.

While Bosworth has said that his December post "wasn’t written for public consumption,” one Facebook insider told BuzzFeed News that a leak was to be expected given the current turmoil within the company.

Bosworth’s comments on the Cambridge Analytica scandal, in which a political consulting firm that was eventually hired by the Trump campaign gained unauthorized access to the data of more than 50 million Facebook users, also echoed what many company insiders have thought for months about the firm’s impact. While some reporters and former political operatives have claimed that Cambridge Analytica used Facebook data to manipulate voters in elections around the world, much of that has been unproven. Bosworth called the scandal, which led to government inquiries and investigations around the globe, “a total non-event.”

As for 2020, the Facebook vice president, who did not respond to a request for comment, wrote that despite his liberal leanings, he has resisted the urge to “pull any lever” in favor of the political outcome he wants, citing the work of political philosopher John Rawls and the fantasy novels of J.R.R. Tolkien.

“I find myself thinking of the Lord of the Rings at this moment,” he wrote. “Specifically when Frodo offers the ring to Galadrial and she imagines using the power righteously, at first, but knows it will eventually corrupt her. As tempting as it is to use the tools available to us to change the outcome, I am confident we must never do that or we will become that which we fear.”

As Facebook’s communications team grappled with the fallout of its executive’s memo, it was simultaneously doing damage control on the clumsy rollout of a new policy regarding manipulated media, or, more specifically, altered videos known as deepfakes. On Monday night, the company announced a new policy under which it would ban content that had been edited by artificial intelligence to superimpose one video on top of another and that would likely mislead an average person into thinking a subject said something they actually hadn't.

The policy would not cover videos that were deceptively slowed down or “edited solely to omit or change the order of words.” In May, a video of Speaker of the House Nancy Pelosi that had been artificially slowed down to make her appear to slur her words was viewed millions of times on Facebook.

Having faced previous criticism for creating a policy that allowed politicians and political candidates to lie in ads, Facebook, through a spokesperson, initially told BuzzFeed News and other outlets that it would allow deepfakes in political advertisements. About an hour later, that spokesperson said they misspoke, saying that deepfakes would not be permitted in any type of ad.

There was still the possibility, however, that a political figure would be allowed to post deepfake or manipulated content if Facebook were to deem that content to be newsworthy. Facebook’s executives have long said they do not want to be arbiters of newsworthiness, and a spokesperson declined to say who would be making judgment calls on a political figure’s postings of manipulated content.

Instead, the spokesperson pointed BuzzFeed News to a 2016 policy that did not directly address posts from politicians.
