Why Facebook Shutting Down Its Old Facial Recognition System Doesn’t Matter

Facebook just made a big deal of shutting down its original facial recognition system. But the company’s pivot to the metaverse means collecting more personal information than ever before.

On Monday morning, Meta — the company formerly known as Facebook — announced that it would be shutting down “the Face Recognition system on Facebook,” a technology that has been raising privacy alarms since it debuted. In a blog post, the company described the move as “one of the biggest shifts in facial recognition usage in the technology’s history.” On Twitter, outgoing CTO Mike Schroepfer and incoming CTO Andrew Bosworth, who previously oversaw Facebook’s Oculus virtual reality division, called the announcement a “big deal” and a “very important decision.” The Electronic Frontier Foundation deemed it “a testament to all the hard work activists have done to push back against this invasive technology.”

But a review of Meta and Facebook’s VR privacy policies, and the company’s answers to a detailed list of questions about them, suggests the company’s face identification technology isn’t going anywhere. And it is only one of many invasive data collection methods that may be coming to a metaverse near you. (Disclosure: In a previous life, I held policy positions at Facebook and Spotify.)

Facebook's recent announcement that it is shutting off its controversial facial recognition system comes at a difficult time for the company, which is facing significant regulatory scrutiny after years of bad press, recently inflamed by a high-profile whistleblower.

But the moment may also be an opportune one. The company is shifting its focus to virtual reality, a face-worn technology that, by necessity, collects an enormous amount of data about its users. From this data, Meta will have the capacity to create identification and surveillance systems that are at least as powerful as the system it’s putting out to pasture. Just because it can create those systems doesn’t mean it will. For the moment, though, the company is leaving its options open.

The fact is: Meta intends to collect unique, identifying information about its users’ faces. Last week, Facebook founder Mark Zuckerberg told Stratechery’s Ben Thompson that “one of the big new features” of Meta’s new Cambria headset “is around eye-tracking and face-tracking.” And while the platform has “turned off the service” that previously created facial profiles of Facebook users, the New York Times reported that the company is keeping the algorithm on which that service relied. A Meta spokesperson declined to answer questions from BuzzFeed News about how that algorithm remains in use today.

Meta may have shut down the facial recognition system on Facebook that raised so many concerns, but given that it intends to keep the algorithm that powered that system, there is no reason the company couldn’t “simply turn it on again later,” according to David Brody, senior counsel at the Lawyers’ Committee for Civil Rights Under Law.

Meanwhile, Meta’s current privacy policies for VR devices leave plenty of room for the collection of personal, biological data that reaches beyond a user’s face. As Katitza Rodriguez, policy director for global privacy at the Electronic Frontier Foundation, noted, the language is “broad enough to encompass a wide range of potential data streams — which, even if not being collected today, could start being collected tomorrow without necessarily notifying users, securing additional consent, or amending the policy.”

By necessity, virtual reality hardware collects fundamentally different data about its users than social media platforms do. VR headsets can be taught to recognize a user’s voice, their veins, or the shading of their iris, or to capture metrics like heart rate, breath rate, and what causes their pupils to dilate. Facebook has filed patents concerning many of these data collection types, including one that would use things like your face, voice, or even your DNA to lock and unlock devices. Another would consider a user’s “weight, force, pressure, heart rate, pressure rate, or EEG data” to create a VR avatar. Patents are often aspirational — covering potential use cases that never arise — but they can sometimes offer insight into a company’s future plans.

Meta’s current VR privacy policies do not specify all the types of data it collects about its users. The Oculus Privacy Settings, Oculus Privacy Policy, and Supplemental Oculus Data Policy, which govern Meta’s current virtual reality offerings, provide some information about the broad categories of data that Oculus devices collect. But they all specify that their data fields (things like “the position of your headset, the speed of your controller and changes in your orientation like when you move your head”) are just examples within those categories, rather than a full enumeration of their contents.

The examples given also do not convey the breadth of the categories they’re meant to represent. For example, the Oculus Privacy Policy states that Meta collects “information about your environment, physical movements, and dimensions when you use an XR device.” It then provides two examples of such collection: information about your VR play area and “technical information like your estimated hand size and hand movement.”

But “information about your environment, physical movements, and dimensions” could describe data points far beyond estimated hand size and game boundary — it also could include involuntary reaction metrics, like a flinch, or uniquely identifying movements, like a smile.

Meta twice declined to detail the types of data that its devices collect today and the types of data that it plans to collect in the future. It also declined to say whether it is currently collecting, or plans to collect, biometric information such as heart rate, breath rate, pupil dilation, iris recognition, voice identification, vein recognition, facial movements, or facial recognition. Instead, it pointed to the policies linked above, adding that “Oculus VR headsets currently do not process biometric data as defined under applicable law.” A company spokesperson declined to specify which laws Meta considers applicable. However, some 24 hours after publication of this story, the company told us that it does not “currently” collect the types of data detailed above, nor does it “currently” use facial recognition in its VR devices.

Meta did, however, offer additional information about how it uses personal data in advertising. The Supplemental Oculus Terms of Service say that Meta may use information about “actions [users] have taken in Oculus products” to serve them ads and sponsored content. Depending on how Oculus defines “action,” this language could allow it to target ads based on what makes us jump from fear, makes our hearts flutter, or makes our hands sweaty.

"The VR industry is kind of in a 'magic eight ball' phase right now."

But at least for the moment, Meta won’t be targeting ads that way. Instead, a spokesperson told BuzzFeed News that the company is using a narrower definition of “actions” — one that does not include the movement data collected by a user’s VR device.

In a 2020 document called "Responsible Innovation Principles," Facebook Reality Labs describes its approach to the metaverse. The first of these principles, “Never Surprise People,” begins: “We are transparent about how our products work and the data they collect.” Responding to questions from BuzzFeed News, Meta said it will be upfront about any future changes, should they arise, to how it will collect and use our data.

Without better clarity about the data that Meta is collecting today, “customers cannot make an informed choice about when and how to use their products,” Brody told BuzzFeed News. More to the point, it’s hard for the public to understand any future changes Meta might make to how it collects and uses our data when the company has never explained exactly what it’s doing now.

Brittan Heller, counsel at the law firm Foley Hoag and an expert in human rights and virtual reality, put it differently: "The VR industry is kind of in a 'magic eight ball' phase right now. On questions about privacy and safety, the answer that flutters up says, 'Outlook uncertain: ask again later.'"

UPDATE

This article has been updated with additional information from Facebook.

