Here's How Facebook Got Into This Mess: A Timeline
A data firm collected tens of millions of people’s Facebook data without their consent. Here’s how that happened.
Over the weekend, Facebook came under fire after reports that Cambridge Analytica, a data analytics firm that worked with President Trump’s campaign, had obtained personal data from tens of millions of Facebook users without their consent.
Between 2010 and 2015, Facebook allowed third-party apps to collect extensive personal data from users and their friends. After realizing in 2015 how that data could be abused, Facebook shut off access and updated its platform, but it was too late — Cambridge Analytica had already collected millions of people’s data. Here’s why that happened and how Facebook got itself into this mess.
2010 — Facebook launches Open Graph API version 1.0 for developers, which allows extensive user data collection.
On April 21, 2010, Facebook opened its social graph to third-party apps, which could request a large amount of data, including information about users’ friends, without having to explain why they needed it.
These apps could acquire a comprehensive dataset that included a user’s public profile (their name, gender, location, time zone, and Facebook ID), as well as their friends’ names, bios, birthdays, education, political views, relationship status, religion, notes, online status, and more. With extended permissions, developers could even gain access to a user’s private messages. The full list of application permissions can be found on page five of this study.
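To make the mechanics concrete, here is a minimal sketch in Python of the kind of login URL a v1.0-era app could construct. The app ID and redirect URI are hypothetical placeholders; the permission names are historical Graph API v1.0 scopes (deprecated since April 2015), and a single consent prompt could bundle a user’s own data together with their friends’ data:

```python
from urllib.parse import urlencode

# Hypothetical credentials for illustration only.
APP_ID = "123456789"
REDIRECT_URI = "https://example.com/auth"

# Historical v1.0 permission scopes: one prompt could request the
# user's own data alongside extensive data about their friends.
scopes = [
    "email",
    "user_birthday",
    "friends_birthday",           # friends' birthdays
    "friends_education_history",  # friends' education
    "friends_religion_politics",  # friends' religious/political views
    "friends_relationships",      # friends' relationship status
    "read_mailbox",               # extended permission: private messages
]

login_url = "https://www.facebook.com/dialog/oauth?" + urlencode({
    "client_id": APP_ID,
    "redirect_uri": REDIRECT_URI,
    "scope": ",".join(scopes),
})
print(login_url)
```

A user clicking through this single dialog granted the app everything in the scope list, which is how one quiz-taker’s consent could expose dozens or hundreds of friends who never saw the prompt at all.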
2012 — Facebook tweaks the design of its permissions request page to make it less apparent to users that they were handing over more personal information.
Screenshots from 2012 show how Facebook changed the options on the permissions page for games from “Allow” and “Don’t Allow” to simply “Play Game,” the latter of which did not communicate clearly to users that they were opting into sharing data. Additionally, in the new design, Facebook hid its explanation of what it considered “basic information” underneath a small “?” symbol that users needed to hover over to read.
2013 — Cambridge University researcher Aleksandr Kogan develops a quiz app using Graph API. Over 270,000 users take the survey, giving away their own data, plus friends’ data.
Researcher Aleksandr Kogan, under the guise of a firm called “Global Science Research,” hired people to take his personality quiz on Amazon Mechanical Turk, a site that connects workers with jobs that require simple tasks for a fee. According to the Intercept, Kogan paid $1 or $2 to workers to complete the survey using their Facebook accounts.
Because of the way Facebook’s third-party platform worked at this time, the over 270,000 survey participants who consented to hand over their data also gave Kogan access to tens of millions of their friends’ Facebook data. Kogan told Facebook and its users that the data would be anonymized, but it wasn’t.
Many people have described Cambridge Analytica’s collection and use of people’s data as a “data breach,” but in now-deleted tweets, Facebook executive Alex Stamos said Kogan did not break into any systems: “He did, however, misuse that data after he gathered it, but that does not retroactively make it a ‘breach.’”
2014 — Facebook publishes a study about a News Feed experiment to influence users’ emotions, and people are pissed.
The study’s publication in June 2014 prompted one of the first instances of widespread outrage over privacy concerns on Facebook. Users were furious at Facebook for having performed the research without more explicit consent.
In 2012, Facebook subjected nearly 700,000 users to a study that attempted to discover whether people could be made to feel happier or sadder based on what Facebook showed them in their News Feed. Facebook said the research was covered by its data use policy, which all users agree to before creating an account, and which therefore constituted informed consent. However, Forbes reported that the “research” clause in the site’s terms was added in May 2012, four months after the study took place.
2014 — Facebook begins deprecating Graph API version 1.0.
During Facebook’s annual F8 conference in 2014, the company announced it was winding down version 1.0 of the Graph API, which significantly limited the amount of Facebook data third-party applications had access to. Additionally, apps would be required to get Facebook’s approval before they could request sensitive data.
A press release from the event said, “We’ve heard from people that they are worried about sharing information with apps, and they want more control over their data.”
However, permissions for Facebook apps enabled between 2010 and 2015 were not retroactively limited, and it’s likely third parties stored data collected from Facebook users on their own servers.
2015 — Facebook shuts down Graph API v1.0.
On April 30, 2015, Facebook shut down access to version 1.0 of its Graph API and updated its platform to give away less data, especially about users’ friends. Facebook also offered more granular control over what information its users shared with developers, plus a new login screen with an added “Edit the info you provide” link.
2015 — The Guardian reports that Ted Cruz’s presidential campaign used “psychological data” from millions of Facebook users. Facebook then takes legal action to force Cambridge Analytica to destroy Facebook users’ data.
Through an inquiry from the Guardian, Facebook learned that Aleksandr Kogan, the researcher behind Global Science Research, had sold the Facebook user dataset he acquired for “research purposes” to a firm called Strategic Communications Laboratories, which later became Cambridge Analytica.
Facebook verified that the firm had acquired user data, but did not publicly acknowledge it. The social media network then legally pressured Cambridge Analytica to destroy “all improperly collected data,” according to a statement provided by a company spokesperson. According to Facebook executive Andrew Bosworth, the firm “certified in a legal document that they had deleted the data,” which is why they were not suspended from the platform at the time.
2016 — Then-presidential candidate Donald Trump’s campaign hires Cambridge Analytica.
Donald Trump’s campaign began investing heavily in Facebook advertising ahead of the presidential election. A super PAC supporting Donald Trump directed an anti–Hillary Clinton video ad aimed at specific audience segments identified by Cambridge Analytica. An investigation by Channel 4 in Britain showed Mark Turnbull, a managing director of Cambridge Analytica, claiming the firm was responsible for the “Defeat Crooked Hillary” video campaign on Facebook.
March 17, 2018 — The New York Times and the Guardian report that Cambridge Analytica still possesses data it inappropriately gathered from as many as 50 million Facebook users.
Cambridge Analytica did not destroy Facebook user data in its entirety, after all. After the reports were published, Facebook suspended Cambridge Analytica, as well as Christopher Wylie of Eunoia Technologies, a vendor contracted by the firm. Wylie provided details of Cambridge Analytica’s use of Facebook data to the New York Times and the Guardian.
March 20, 2018 — The Federal Trade Commission launches an inquiry into Facebook and Cambridge Analytica.
The FTC is investigating whether Facebook violated a settlement reached with the government agency in 2011 over user privacy protections. US lawmakers also called for CEO Mark Zuckerberg to testify before Congress.
March 21, 2018 — Zuckerberg breaks his silence and announces a new tool that offers an easy way to revoke apps’ permissions to access your data.
"We have a responsibility to protect your data, and if we can't then we don't deserve to serve you. I've been working to understand exactly what happened and how to make sure this doesn't happen again," he said in a post.
Zuckerberg announced that Facebook would cut off a developer’s access to a user’s data if that person hadn’t used the app in three months, that it would reduce the information people are required to give app developers, and that it would audit all apps that had access to large amounts of data before 2014, when the platform dramatically reduced the amount of data shared with third parties.