Microsoft Says It Won't Sell Facial Recognition To The Police. These Documents Show How It Pitched That Technology To The Federal Government.

Last week, Microsoft said it would not sell its facial recognition to police departments. But new documents reveal it was pitching that technology to at least one federal agency as recently as two years ago.

A photo shows people at a restaurant with squares around their faces, illustrating the face-detection capability of Microsoft's Azure facial recognition service.

In early June, Microsoft joined a growing list of tech companies that pledged not to sell facial recognition technology to police departments until the controversial technology was federally regulated. But that announcement left a loophole: selling facial recognition to the federal government.

Newly released emails show the company has tried to sell the controversial technology to the government for years, including to the Drug Enforcement Administration in late 2017.

Those documents, obtained by the American Civil Liberties Union via a public records lawsuit, provide a rare look into how the Redmond, Washington–based company tried to sell artificial intelligence services to federal agencies six months before its July 2018 call for “public regulation and corporate responsibility” around facial recognition. Last week, Microsoft said “we do not sell our facial recognition technology to US police departments today” and committed not to do so “until there is a strong national law grounded in human rights.”

But that pledge did not address any potential or ongoing relationships with federal agencies. When asked by BuzzFeed News, Microsoft did not immediately comment on whether it has provided, or is currently providing, its facial recognition technology to federal law enforcement agencies.

The emails obtained by the ACLU show that the company pitched facial recognition as a law enforcement tool to the DEA in late 2017 as it pushed to expand the offerings on its government cloud platform, Microsoft Azure Government Cloud. In September of that year, an individual whose name was redacted but whose title was listed as the DEA’s chief technology officer wrote that they were hosting the Microsoft Cognitive Services Group “to discuss use-cases for their Media Services.”

“As you may be aware, Microsoft Azure has many of these services (Translation, Transcription, Video Processing, Facial Recognition, etc.) running in the Public Azure,” the person wrote on Sept. 15, 2017. “Microsoft has only some of these services running in the Microsoft Azure Government (MAG) Cloud and they are looking at what else needs to be transitioned over to MAG.”

The person later noted that MAG was approved for “Law Enforcement Sensitive things” and that they wanted to create a pilot project to test a variety of video and audio recording technologies.

A DEA spokesperson declined to comment on the agency’s conversations with Microsoft or on its testing or deployment of facial recognition.

Other emails show that DEA representatives visited Microsoft’s office in Reston, Virginia, in November 2017 to see a demonstration of a suite of products, including translation services, document transcription, “optical character recognition in video,” and Azure facial recognition. In a follow-up message after the meeting, a Microsoft employee, whose name was redacted, gave a brief overview of the demos his team showed the agency, including “Face API: Identify similar faces, develop a face database.”

“Please let us know when and how we can take the next step on a prototype,” they wrote. Based on the emails, it’s unclear if any prototype was built.

Eight months after those meetings, Microsoft President Brad Smith penned a blog post calling for “thoughtful government regulation and for the development of norms around acceptable uses” surrounding facial recognition.

“If there are concerns about how a technology will be deployed more broadly across society, the only way to regulate this broad use is for the government to do so,” he wrote, before acknowledging the possibility of racial profiling and misidentification.

Despite those concerns, Microsoft’s representatives continued to pitch facial recognition as part of its Azure Government offering. In November 2018, a “Sr. Microsoft SME,” whose name was redacted in the email, sent another note to a DEA representative requesting a meeting. Azure has a number of relationships with federal agencies, including the Department of Defense and Immigration and Customs Enforcement (ICE).

While it’s not clear if the DEA moved forward with Microsoft’s Azure AI offerings, the fact that Microsoft pitched such services in the first place “is concerning,” Kade Crockford of the ACLU of Massachusetts told BuzzFeed News. In October, the ACLU sued the Department of Justice, FBI, and DEA after those agencies failed to comply with a public records request regarding their use of facial recognition and other biometric tracking technology.

Microsoft's recent decision not to provide facial recognition to police departments is “a positive step,” said Crockford, who noted that civil rights organizations have demanded it for years, after studies showed the technology has high rates of misidentification among racial minorities.

“The DEA has a long history of racially disparate or racist practices and has been engaged in wildly inappropriate mass surveillance,” they said.

BuzzFeed News previously reported that individuals associated with the DEA tested Clearview AI, controversial facial recognition software built using billions of photos scraped from social media sites including Facebook, Instagram, and Twitter. As of February, more than 20 users associated with the DEA had run about 2,000 searches, according to data viewed by BuzzFeed News.

Following nationwide protests against racial injustice and the police brutality faced by Black people, companies have pulled back on their facial recognition offerings. Earlier this month, Amazon said it would place a one-year moratorium on selling its biometric face identification service, Rekognition, to police, while IBM said it would stop developing or researching facial recognition.

When asked by BuzzFeed News, however, Amazon, IBM, and Microsoft refused to disclose which police departments, if any, had previously used their facial recognition services.

Earlier this month, BuzzFeed News also reported that the Justice Department gave the DEA permission “to enforce any federal crime committed as a result of the protests over the death of George Floyd.”

