
Amazon Won’t Say It Doesn’t Work With ICE

Public scrutiny of US law enforcement’s use of Amazon’s facial recognition tech has grown in response to reports that the technology could at times be inaccurate, and even racially biased.

Posted on December 12, 2018, at 6:50 p.m. ET

Amazon once again declined to answer questions about whether it works with US Immigration and Customs Enforcement (ICE), even as civil rights concerns around the use of the tech giant’s facial recognition tools by US law enforcement and government agencies continue to grow.

Approached by BuzzFeed News after the subject came up during a hearing between Amazon policy representatives and New York City Council members Wednesday morning, Amazon would not say outright that the company does not work with ICE. A company spokesperson said it was policy for Amazon to not talk about its clients.

The question of whether Amazon provides facial recognition services to ICE arose when Brian Huseman, Amazon’s vice president for public policy, brought up the company’s Rekognition platform and tools in response to a question from council Speaker Corey Johnson about its work with ICE. Johnson did not mention Rekognition in the question, but Huseman brought up the service voluntarily.

“What is Amazon’s relationship with ICE?” Johnson asked.

“We provide our Rekognition service to a variety of government agencies, and we think that the federal government should have access to the best available technology,” Huseman testified.

Amazon testifies that they believe ICE should have "the best available technology" as they provide facial recognition software to facilitate deportations. We are a city of immigrants. This is shameful. #NoAmazonNYC @MaketheRoadNY @nychange @caaav @MPower_Change

“I see how Amazon could say that the company didn’t say, ‘yes, [we confirm],’” New York City Council member Brad Lander, who was at the hearing, told BuzzFeed News in an interview. “But they gave a non-denial that anyone listening would take as a confirmation.”

“So I, then, in my follow-up question a few minutes later, began by saying, ‘You affirmed that Amazon provides facial recognition technology to ICE,’” Lander continued. “Huseman had an opportunity to correct the record. He could have said, ‘No, we did not affirm.’”

“Is it true that he said out of his mouth, ‘Yes, I affirm that Amazon is providing facial recognition technology to ICE’? They did not say those words. But they gave in their first response, a non-denial, and in the second, given an opportunity, he chose not to correct the record.”

In an October story by the Daily Beast and the Project on Government Oversight that detailed documents showing Amazon pitched ICE on its facial recognition tools last summer, a spokesperson for ICE said in a statement that “publicly available procurement data” shows that the government agency does not have a contract with Amazon Web Services. "It would be inappropriate to comment on this specific piece of technology,” the statement continued.

Amazon, through its cloud-computing arm, Amazon Web Services, does help support the back-end infrastructure for ICE — although, as far as the public knows, it does so indirectly, by providing cloud services to third parties that do have direct contracts with ICE. In June, a group of Amazon workers wrote a letter demanding that the company end the sale of its facial recognition technology to government agencies. “We … know that Palantir runs on [Amazon Web Services]. And we know that ICE relies on Palantir to power its detention and deportation programs,” they wrote.

“We refuse to build the platform that powers ICE, and we refuse to contribute to tools that violate human rights,” the letter continued.

In October, the activist organization Mijente published a report noting that a company called Forensic Logic makes a software program used by more than 5,000 law enforcement agencies and is hosted on Amazon Web Services. The software, the report noted, is “designed to be compatible with federal immigration databases,” and would allow ICE “unprecedented access to information about employers, associates, and hangout spots.”

“Amazon making their facial recognition available to ICE would be one of the worst examples of big tech collaboration with Trump’s deportation agenda,” Jacinta Gonzalez, a field organizer for Mijente, told BuzzFeed News in a statement. “Facial surveillance tools are some of the most dangerous products on the market. … We call on workers at Amazon to say, ‘Not in our name,’ and refuse to work on this product.”

Public scrutiny of US law enforcement’s use of Amazon’s facial recognition tools has grown in response to reports that the technology could at times be inaccurate, and even racially biased. In July, the nonprofit organization American Civil Liberties Union ran a test that showed Rekognition falsely matched 28 members of Congress with arrest mugshots — and the false matches were disproportionately people of color. In October, BuzzFeed News published an investigation into how the Orlando Police Department was using the tech in a second pilot in Florida, which revealed that the pilot program lacked internal and external policy guidelines, and was initiated without any hands-on training from Amazon for police officers.

“Amazon shouldn’t be selling the Trump administration unproven, dangerous, and biased face recognition technology that is ripe for abuse,” Donna Lieberman, executive director of the New York Civil Liberties Union, said in a statement in reaction to the events today. “As the company prepares to move to Queens, a home to so many vibrant immigrant communities, Amazon has no business cashing in on ICE’s targeting and harassment of New Yorkers.”

And as Amazon refuses to say outright whether the company does or does not work with ICE, discontent within the company has also festered. In November, Amazon executives defended the company’s facial recognition technology at an all-hands staff meeting after employees raised civil rights concerns about the tech’s potential misuse.

“It’s hard to trust that harm and abuse [by Amazon Rekognition] can be prevented if it is only post-mortem and through the Terms of Service,” an Amazon employee who requested anonymity told BuzzFeed News at the time.
