Former Facebook Employees Say The Company’s Recent Prioritization Of Privacy Is All About Optics

Last May, Facebook promised to create a “Clear History” function it said would give users more control over their data. Nine months later, the tool is nowhere to be found, and sources say it's a key example of the company's “reactionary” way of dealing with privacy concerns.

Facebook spent most of 2018 embroiled in one scandal or another. But there was a point early on in the year when Mark Zuckerberg thought he could turn down the heat by offering a fix for the public’s privacy concerns. It was just weeks after the news broke that political consultancy Cambridge Analytica had surreptitiously obtained and employed the personal data of millions of people. And as the company headed into its annual F8 developers conference in May, the chief executive proposed a dramatic change ahead of a rehearsal for the keynote address: What if they announced a tool that let users clear the web-browsing information Facebook uses to target them with ads?

The suggestion caught people involved in the event’s production off guard; planning for F8 begins at least six months in advance. “Clear History” didn’t exist; it was barely an idea. But organizers still scrambled to build its announcement into Zuckerberg's F8 keynote address. They'd already scrapped plans to unveil Portal, a video calling device that Facebook's leadership thought might be seen as too invasive given the company’s predicament.

It was a bold public relations play. And for those familiar with the origins of the Clear History announcement, it demonstrated not only Zuckerberg’s unilateral power over product direction, but also Facebook’s long history of prioritizing optics and convenience over substantive protections for the people who use it. Company sources who spoke to BuzzFeed News characterized Zuckerberg’s proposal as “reactionary,” a response intended to ease the negative attention on the company following the Cambridge Analytica firestorm. They also said it might explain why the Clear History tool, whose announcement was proposed on the fly by Zuckerberg, is still not available nearly a year after he introduced it on stage at F8.

“We really had nothing to show anyone. Mark just wanted to score some points.”

“If you watch the presentation, we really had nothing to show anyone,” said one person close to the F8 production. “Mark just wanted to score some points.”

Zuckerberg and Chief Operating Officer Sheryl Sandberg do not make judgment calls “until pressure is applied,” said another former employee, who worked with Facebook's leadership and declined to be named for fear of retribution. “That pressure could come from the press or regulators, but they’re not keen on decision-making until they’re forced to do so.”

“Whether it is with genocide or false news, there are never going to be changes until the pressure becomes too great,” they added.

Facebook has long portrayed itself as an advocate for user rights. But former employees and critics say the company's true ethos has often been in opposition to this. Facebook's communications around privacy have historically been opportunistic and protectionist, deployed as cover for the latest transgression born of its "move fast and break things" ideology — from the 2007 Beacon program, which allowed companies to track purchases by Facebook users without their consent, to the 2010 loophole that allowed advertisers to access people’s personal Facebook information without permission.

“Sometimes we move too fast — and after listening to recent concerns, we're responding,” Zuckerberg wrote in a 2010 op-ed in the Washington Post. That was just before the company agreed to a Federal Trade Commission consent decree settling charges that Facebook had routinely changed users’ privacy settings and shared their information without permission. The company is currently negotiating with the FTC, which has been investigating whether Facebook violated the terms of that consent decree and should be punished.

Last spring, as Facebook dealt with fallout from Cambridge Analytica, compliance with Europe’s new General Data Protection Regulation (GDPR), and renewed attention on how it tracks all internet users following Zuckerberg’s ten hours of congressional testimony in April, the company’s ask-forgiveness-not-permission playbook was in plain view. It took out full-page newspaper apologies, placed its chief executive on podcasts and televised interviews, and sent Sandberg to meet with state attorneys general and lawmakers behind closed doors.

“My worry is that Facebook is doing anything it can to garner goodwill and defuse concern,” said Ashkan Soltani, the FTC’s chief technologist from October 2014 to November 2015. “I’m not sure of the sincerity of those actions since, historically, the company uses privacy selectively and strategically.”

“Whether it is with genocide or false news, there are never going to be changes until the pressure becomes too great.”

Five former employees who spoke with BuzzFeed News say they are skeptical of that goodwill effort, with three noting that the external messaging and marketing around privacy has only become a focus for executives during the last 12 months. One pointed out that an international ad campaign last spring, focused on how “fake news” and “clickbait” “is not your friend,” was quickly repurposed to address “data misuse” days after the first Cambridge Analytica stories broke. Two highlighted Zuckerberg’s desire to rush out a Clear History announcement ahead of F8. Some ridiculed the company’s privacy pop-up store in New York City’s Bryant Park in December, which was built to show that “privacy is the foundation of our company.”

“It’s public relations,” said one former employee. “It’s, ‘Hey, look at this shiny thing, please don’t pay attention to this mushroom cloud.’” This also appears to be the case with Clear History, which, while touted by both Sandberg and Zuckerberg in recent months as an example of Facebook’s commitment to getting privacy right, has yet to actually launch.

Facebook disagreed with the characterization that privacy promises are used to distract from the real problems.

“We know we have work to do to regain people's trust, and it's why we've strengthened our teams, created a new privacy and data use organization, built new tools, and set clearer policies designed to better protect people's information,” a Facebook spokesperson said in a statement to BuzzFeed News. “We still face legitimate scrutiny, but we’re not the same company we were even a year ago, and we’re determined to do more to keep people safe across our services.”

Former employees, however, are not willing to give the company the benefit of the doubt.

“They have a long-running strategy of using communications to disagree and push this counter-narrative against any criticism,” one said. “They’re playing the same game they’ve always played, but the challenge for them is that the world has changed and privacy concerns are increasing dramatically.”

Risk Factors

With his company staring down what could reportedly be a multibillion-dollar fine from the FTC for violating its 2011 consent decree, Zuckerberg is acutely aware of what public perception around privacy could do to Facebook’s business. In its 2018 annual report, the company outlined not only the risks associated with changing privacy laws including GDPR and the recently passed California Consumer Privacy Act, which the company lobbied against, but also the danger of becoming the media’s punching bag if news outlets dug into Facebook’s practices around data use and sharing.

“Unfavorable publicity regarding, for example, our privacy practices, terms of service, product changes, product quality, litigation or regulatory activity, government surveillance, the actions of our advertisers, the actions of our developers whose products are integrated with our products ... has in the past, and could in the future, adversely affect our reputation,” the company stated in a January financial filing. The filing goes on to outline the “intense media coverage” surrounding Cambridge Analytica and the possibility that negative publicity could adversely affect the company’s size, engagement, user loyalty, and, in turn, revenue.

While Facebook continues to grow — its sales, profit, and monthly active users (MAU) all increased in 2018 — there are plenty of signs that it may not be able to continue on that path. Multiple surveys have shown that users are losing trust in the social network, including one in which the company ranked last among brands that handle personal data, such as Amazon, Google, Visa, and Comcast. And that mistrust could translate to unrest among investors.

“The list of problems the company is grappling with is vast, including complicity in a genocide, enabling social and political instability in different countries around the world, the unwitting sharing of consumer data and antagonized legislators in the US, the UK, Europe and beyond,” Pivotal Research’s Brian Wieser, one of the more bearish Facebook analysts, wrote in a January report.

One former employee noted that Facebook’s executives historically only took privacy seriously if problems affected the key metrics of daily active users, which totaled 1.52 billion accounts in December, or monthly active users, which totaled 2.32 billion accounts. Both figures increased by about 9% year-over-year in December.

“If it came down to user privacy or MAU growth, Facebook always chose the latter.”

“If it came down to user privacy or MAU growth, Facebook always chose the latter,” the person said. That source pointed to internal Facebook emails obtained and released by a UK parliamentary inquiry that showed, among other things, the company’s then–deputy chief privacy officer Yul Kwon discussing how to allow Facebook’s Android app to read a phone’s call logs without triggering a permission pop-up.

Ironically, Facebook’s leaders were worried about the public relations fallout that could have followed if Android’s permission prompts had appeared as intended, asking users to consent to the app reading their call logs. Instead of asking for less access, however, they sought a workaround that would let them keep sucking up the data without making people aware that they were doing so.

“This is a pretty high risk thing to do from a PR perspective but it appears that the growth team will charge ahead and do it,” one Facebook employee, Michael LeBeau, wrote in early 2015. “We think the risk of PR fallout here is high.”

The fallout, however, came more than three years later, when UK parliamentarians obtained the emails and used them in a report earlier this month to bolster their case that Facebook operated as a “digital gangster” with little regard for law or scrutiny.

“It is evident that Facebook intentionally and knowingly violated both data privacy and anti-competition laws,” the House of Commons Digital, Culture, Media, and Sport (DCMS) Committee wrote in what is perhaps the strongest rebuke of the company by a governing body to date.

“Highly Decentralized Company”

Two people who used to work at Facebook said that it’s hard to take the company’s apologies and commitments to privacy seriously after witnessing its attempt to get ahead of outlets preparing to publish stories about Cambridge Analytica. Last March, the New York Times, the Guardian, and the Observer were readying stories about a former employee at the political consulting firm who had evidence that Cambridge Analytica had illicitly obtained data on millions of Facebook users and deployed that information for American political campaigns.

The outlets, according to multiple people familiar with the situation, had been in communication with Facebook about their stories for at least a week, and the company’s public relations team was well aware that the pieces would be published that weekend. In response, Facebook’s communications team decided to get ahead of the stories, publishing a blog post from the company’s deputy general counsel the preceding Friday about suspending accounts associated with Cambridge Analytica.

“We were essentially scooping the news,” one source said, explaining that Facebook was trying to soften the blow of any future story on the matter.

Despite the attempt, the blog post, which was picked up by major news outlets, acted as an accelerant for the stories that would publish the next day. That, along with a legal threat sent to Guardian Media Group in the UK, compounded the attention and turned Cambridge Analytica into a full-blown maelstrom.

“We were essentially scooping the news.”

Three former employees who spoke to BuzzFeed News said that there are people at Facebook who do want to put the social network’s users first. One acknowledged Facebook’s constant privacy scandals but attributed those mistakes to a “highly decentralized company” where issues arise “less because of an act of malice, and more as unintended consequences.” Another said there are “privacy purists who care about this deeply” but that “there is an equal number of people that looks at privacy as a lever to pull to improve user sentiment and, in turn, revenue.”

Zuckerberg’s thinking fluctuates between both camps, that person said, favoring privacy when he realizes his company’s actions have triggered a backlash.

Last April, technology news site TechCrunch found that Zuckerberg and other executives had been given “special treatment” in the form of a tool that deleted old messages from both a sender’s and receiver’s mailboxes. Unlike ordinary Messenger users, who had no way to remove their messages from recipients’ inboxes, Zuckerberg had been selectively eliminating threads from 2014 and earlier, sparking outrage among some employees who didn’t understand why their chief executive had a privacy feature that wasn’t available to everyone else.

Two sources recalled the backlash at an all-hands meeting, where several employees confronted the chief executive for secretly claiming special platform privileges that belied the company’s supposed internal dedication to transparency. In response, Zuckerberg told employees at the all-hands that the message deletion practice had come into effect because of the 2014 Sony Pictures hack to protect executives’ communications. A Facebook spokesperson declined to comment on the substance of internal discussions.

One source called the answer a “dodge” scripted by the company’s internal communications team, and noted Zuckerberg’s hypocrisy was in plain view: He protected his own privacy, while publicly diminishing privacy concerns that emerged in the wake of the Cambridge Analytica scandal.

Ever reactive, Facebook quickly announced plans to launch an “unsend message” tool, which would allow users to retract messages within 10 minutes of sending them. That feature, which is not at all the same thing as Zuckerberg’s ability to delete years-old messages, is now reportedly being tested in certain markets.

For one former employee, this incident highlights a systemic issue at a company that is worth more than $450 billion.

“I do think these problems have been traditionally viewed as communications, policy, and legal problems, not necessarily as core product challenges, and that’s likely why they’re in this problem,” the source said. “Ultimately when comms and policy and legal have been called on to solve an issue, it means that we’re already in a crisis.”

Other sources told BuzzFeed News that Facebook executives continue to view the problems of 2018 fundamentally as communication issues. They said some insiders among leadership and the rank and file could not understand how Facebook had become the focus of so much public ire and floated the idea that news publications, whose business models had been decimated by Facebook and Google, had been directed to cover the company in a harsher light.

Last summer, the company invited a number of publications — including the Wall Street Journal, New York Times, and BuzzFeed News — for what one person called an off-the-record “media reset.” Executives including Sandberg, chief product officer Chris Cox, and augmented and virtual reality vice president Andrew “Boz” Bosworth met reporters and editors in an attempt to rebuild relationships with outlets that had been covering Facebook critically. (The author of this story learned of these talks independent of BuzzFeed News’ meeting with Facebook’s executives.)

Many of the former Facebook insiders who spoke with BuzzFeed News struggled to understand why there have been so few management changes after the past year. “Certain leaders have been making bad calls,” one said, leaving the company in “crisis after crisis.” Yet aside from an executive shuffle last May, in which leaders were moved into different positions, few people, besides policy and communications head Elliot Schrage, have been shown the door. (And even Schrage still technically remains at the company in a special projects advisory role.)

“There’s an abdication of responsibility by the two at the top that runs deep — all the way down to junior leadership looking the other way,” another former employee said.

The UK’s DCMS committee agreed. “The management structure of Facebook is opaque to those outside the business and this seemed to be designed to conceal knowledge of and responsibility for specific decisions,” it wrote.

Mark Zuckerberg describes the Clear History tool at Facebook's F8 developers conference last May.

Where Is Clear History?

Late last year, Facebook decided to allay privacy concerns by hiring some of its biggest critics. In December, the company scooped up Nate Cardozo, formerly of the Electronic Frontier Foundation; Robyn Greene, from New America’s Open Technology Institute; and Nathan White, of Access Now, a digital rights foundation.

“After the privacy beating Facebook’s taken over the last year, I was skeptical too,” Cardozo, who once called the company’s business model “creepy,” wrote in a Facebook post announcing his new position. “But the privacy team I’ll be joining knows me well, and knows exactly how I feel about tech policy, privacy, and encrypted messaging.”

“Hiring new people doesn't absolve Facebook for past bad practices, or guarantee future improvements,” Estelle Massé, a senior policy analyst for Access Now, wrote in an email to BuzzFeed News. “Given the legacy of Facebook’s policies and practices, it will be difficult to right this ship.”

Thus far, Facebook’s public discussions of Clear History appear to have been more about communications strategy than charting a new course. In a Facebook post looking back on 2018, Zuckerberg pointed to the tool as one that would “give people more transparency,” while Sandberg highlighted it as evidence of Facebook’s willingness to change during a speech at the World Economic Forum in Davos, Switzerland, last month.

Still, nine months after its initial announcement, Clear History is nowhere to be found. A Facebook executive conceded in a December interview with Recode that “it’s taking longer than we initially thought” because of issues with how data is stored and processed. The company will now reportedly start testing the tool this spring, an effort led by a new privacy product unit Zuckerberg created last May amid various scandals.

“We want to make sure this works the way it should for everyone on Facebook, which is taking longer than expected,” the company said in a statement to BuzzFeed News.

It’s unclear if new high-profile hires, like Cardozo and Greene, will work with Facebook’s new privacy unit or if they will be involved with Clear History. (Zuckerberg did say last May that the company would be working with “privacy advocates” on its new tool to “make sure we get it right.”) The company has reached out to groups like Access Now, the Electronic Frontier Foundation (EFF), and the Center for Democracy and Technology (CDT), as well as academics.

Sources confirmed that CDT and EFF were advising Facebook on its Clear History tool, but could not disclose specifics of their meetings due to nondisclosure agreements. Access Now’s Massé confirmed Facebook had reached out on a number of issues, including Clear History, in the last few months, but called the conversations “punctual and limited.”

“Despite repeated statements and apologies from the company, we are not seeing a shift in Facebook data practices or an attitude that would suggest that they take data protection seriously,” she said. “What we are seeing so far are reactionary measures in an attempt to sway public opinion, rather than a fundamental shift in the way the company considers users’ rights to privacy and data protection.”

“There could be a gap between what Facebook says it’s deleting and what they’re actually deleting.”

Privacy experts also pushed back on the idea that Clear History could be a cure-all for Facebook’s privacy ills. For one, it’s still unclear what the tool will allow users to delete and, as the CDT’s Natasha Duarte notes, features like these don’t necessarily guarantee better privacy. “The tool may be able to delete information that Facebook holds about a user’s interaction with other websites, but inferences from those interactions may already be incorporated into Facebook’s algorithm,” she said.

For example, imagine if a user visited a site for Dyson vacuums and Facebook registered that interaction through a tracker. Clear History might let a user remove information about that Dyson site visit from Facebook’s servers, but if that information has already been collected and algorithmically processed into a preference for ads about other household cleaning products, clearing history doesn't mean much. As Duarte explained: “There could be a gap between what Facebook says it’s deleting and what it is actually deleting.”
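To make that gap concrete, here is a minimal, hypothetical sketch in Python of how a derived ad preference can outlive the raw browsing record it came from. The class, field, and method names are illustrative assumptions for this article, not Facebook’s actual systems or APIs.

```python
from dataclasses import dataclass, field

@dataclass
class AdProfile:
    """Toy model of a user record: raw tracked events plus derived interests."""
    browsing_events: list = field(default_factory=list)   # raw off-site activity
    inferred_interests: set = field(default_factory=set)  # inferences derived from it

    def record_visit(self, site: str, category: str) -> None:
        # The raw event is logged, and an interest is inferred from it right away.
        self.browsing_events.append(site)
        self.inferred_interests.add(category)

    def clear_history(self) -> None:
        # A "clear history"-style control that wipes only the raw events.
        self.browsing_events.clear()


profile = AdProfile()
profile.record_visit("dyson.com", "household cleaning")
profile.clear_history()

print(profile.browsing_events)     # [] -- the record of the visit is gone
print(profile.inferred_interests)  # {'household cleaning'} -- the inference persists
```

In this toy model, wiping the browsing events does nothing to the interests already derived from them, which is precisely the gap Duarte describes.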

Former insiders were also concerned about how much Facebook would emphasize or promote Clear History after launch, noting that past privacy features have sometimes been introduced with minimal functionality and a lot of friction, possibly to discourage use. One former employee cited a feature that allows users to download all their Facebook information, but noted that it was hard to find and produced files in formats that were hard to read and share. Another recalled how the company unveiled a “Nearby Friends” tool but required “an inordinate number of clicks” to turn off the feature, which then defaulted to pausing a user’s participation for only a short period of time.

“It seemed like a douchey move, and it was unclear if it was a deliberate choice or poor design,” the employee said. (A Facebook spokesperson disagreed with these characterizations and said the company builds controls “so that people will be able to easily find and use them.”)

After the last 12 months, Facebook has lost the benefit of the doubt, according to privacy experts, and Clear History may be too little, too late. Groups including Access Now and CDT are calling for policymakers to step in, and even the social network’s executives seem resigned to some type of eventual privacy law in the US.

“Facebook has made good on just about every opportunity to lower expectations that it would protect user privacy without a government forcing it to,” Massé told BuzzFeed News. “It is high time for the US to adopt comprehensive data protection legislation to bring sectorwide safeguards for our personal information.”

The company’s track record speaks for itself, said Gennie Gebhart, a consumer privacy researcher at EFF. Gebhart, who’s been in discussions with Facebook about Clear History, noted that she maintains a certain skepticism that the company is capable of deeper change, and compared its past privacy promises to rearranging deck chairs on the Titanic.

“They’re focused on solving the problem of needing to be seen like they're doing something, rather than solving the actual problem,” she said. “And that needs to change.”●

