With No Laws To Guide It, Here's How Orlando Is Using Amazon's Facial Recognition Technology

New documents obtained by BuzzFeed News reveal the most detailed picture yet of how the Orlando Police Department is using Amazon Rekognition, the tech giant’s facial recognition technology.

Walking around downtown Orlando, you might not notice the lightbulb-sized camera affixed to one of the traffic signal poles along the city’s palm tree–studded avenues. But it’s there, scanning all the same. If it sees you, the camera will instantly send its live video feed to Amazon’s “Rekognition” facial recognition system, which cross-references your face against persons of interest. It’s one of three IRIS cameras in the Orlando area whose video feeds are processed by a system that could someday flag potential criminal matches — for now, all the “persons of interest” are volunteers from the Orlando police — and among a growing number of facial recognition systems nationally.

In the US, there are no laws governing the use of facial recognition, and there is no regulatory framework limiting its law enforcement applications. There is no case law or constitutional precedent upholding police use of the tech without a warrant; courts haven’t even decided whether facial recognition constitutes a search under the Fourth Amendment. The technology is still plagued by inaccuracies.

But that hasn't stopped law enforcement from piloting these systems. According to documents obtained by BuzzFeed News, the city of Orlando — which initially allowed its original Rekognition pilot to expire amid growing public outcry — just embarked on a second pilot that allows for an unspecified but “increased” number of additional cameras.

The documents, obtained by BuzzFeed News via a Freedom of Information request, show that Amazon marketed its facial recognition tools to Orlando’s police department, providing tens of thousands of dollars of technology to the city at no cost, and shielding the Rekognition pilot with a mutual nondisclosure agreement that kept its details out of the public eye. More broadly, they reveal the accelerated pace at which law enforcement is embracing facial recognition tools with limited training and little to no oversight from regulators or the public.

"Providing customers with an opportunity to test technology with free credits is a common practice in the industry and something we offer to many of our customers with various AWS services," an Amazon Web Services spokesperson said in a statement. "Talking to organizations about products and new features under a non-disclosure agreement is also something we do frequently with many of our customers for the purposes of protecting intellectual property and competitive information. We continue to support our customers in the responsible use of the technology which includes providing publicly available best practices and documentation as well as ongoing guidance from our machine learning experts, all of which is standard for customer engagements.”

Amazon, however, declined to answer several specific questions about Rekognition on the record, among them: whether the system learns or otherwise improves from the video it ingests; whether Amazon provided Orlando law enforcement with hands-on training to help them understand how to use and interpret Rekognition (apart from emailed guidance and publicly available documentation); and how, exactly, the system processes and disregards faces that are not those of “persons of interest.”

“Before integrating any new technologies into American life, we must be absolutely sure that those innovations are imbued with our values,” Democratic Sen. Edward Markey, who sent a letter to Amazon CEO Jeff Bezos expressing his concern about the company's facial recognition services, told BuzzFeed News. “I am not convinced Rekognition passes that test.”

By contrast, decision-makers from Orlando seem prepared to go full steam ahead with tests of Amazon’s technology, though emails between city officials and Amazon reveal there were setbacks. Sgt. Eduardo Bernal, a public information officer for the city’s police department, told BuzzFeed News that Amazon provided no hands-on training on Rekognition, just standard documentation. Test results were flawed. There were miscommunications, including an embarrassing misstep that required an apology from Amazon — to the public and to Orlando PD.

To be clear, Orlando has not yet deployed a citywide facial recognition project. It is not currently processing the faces of pedestrians by comparing them to the faces of known criminals, nor are the alerts the system sends meant to prompt police officers to detain suspicious “persons of interest.” But the city's Rekognition pilot is already testing how the technology would perform these kinds of tasks. Which means — to some extent — that the idea of a public-facing facial recognition database that automatically scans the city for possible criminal matches has already won.

"When people live in societies like that, it clamps down on their freedom of speech."

“What’s at stake here is our expectation of privacy in public, through our movements in time,” Clare Garvie, an associate at Georgetown Law's Center on Privacy and Technology, told BuzzFeed News. Real-time facial recognition, she said, is better understood as real-time surveillance. “It doesn’t just enable location tracking — it enables relationship tracking and the ability of law enforcement to identify groups of people, instantaneously, at any given point in time.”

In other words, a network of cameras would mean individuals could be tracked as they moved about the city. By detecting these persons of interest as well as their patterns of movement, such a system could use that data to figure out who they associated with — and how those people moved across the city as well.

“When people live in societies like that, it clamps down on their freedom of speech,” said Jennifer Lynch, a senior staff attorney with the digital rights group Electronic Frontier Foundation, pointing to the NYPD’s surveillance of Muslim communities in New York City and beyond as a stark example. “People were less likely to talk to people they didn’t know in their neighborhood or in their mosques, at the risk of being associated with a suspected person,” Lynch said. They also altered their own behavior, and self-censored. “This type of surveillance has a really strong impact on communities.”

Documents obtained by BuzzFeed News show the initial rollout of Orlando’s Amazon Rekognition pilot was marked by internal miscommunication that led to both the city of Orlando and Amazon Web Services — Amazon’s cloud computing arm that offers its facial recognition tools — presenting confusing and contradictory information about the pilot to the public.

After the contract between the City of Orlando and Amazon was finalized in December 2017, documents show that in mid-February a team from Amazon Web Services spent two days in Orlando connecting the city’s video feeds to AWS Rekognition for a “proof of concept” project. This was Orlando’s first-phase pilot of Amazon Rekognition: video streams from five cameras inside city buildings and three more outside in downtown Orlando, all working to identify the faces of the seven volunteer officers when they walked by.

Amazon cautioned the Orlando Police Department early on that the system would give mixed results, documents show. “There are many factors that can affect the outcome,” an Amazon Web Services employee from its enterprise sales division wrote in an email obtained by BuzzFeed News. “The [proof of concept] may generate as many questions as answers.”

After the camera feeds were set up to work with Rekognition, an Amazon employee told the Orlando Police Department in a Feb. 14 email that the system worked unevenly. “Currently the full loop is working. Video is being ingested, and Rekognition is analyzing it,” the AWS employee wrote. “We have had positive hits, and some missed faces.”

Such issues are hardly uncommon for a real-time facial recognition system, according to Georgetown Law’s Garvie. “If you are taking video surveillance footage, and running facial recognition systems, you have uncooperative subjects — they’re not necessarily looking straight at the camera, their faces might be partially obscured, there could be multiple faces in a given frame, and faces might be blurred because they’re in motion,” she said. “Camera angles might be poor because generally, surveillance cameras are mounted above our heads.”

These variabilities can degrade the accuracy of facial recognition systems — and often do. Case in point: of the 170,000 people who arrived at Cardiff, South Wales, for the 2017 Champions League soccer final, a police facial recognition experiment flagged 2,470 as potential criminal matches. But it got 92% of those results wrong; all in all, the experiment saw 2,297 false positives.

South Wales is an extreme case. But it’s an example of facial recognition’s nascency sending a public surveillance effort off the rails. The technology is not easy to deploy, and there are plenty of opportunities for things to go awry. Emails obtained by BuzzFeed News suggest Orlando police officials and Amazon Rekognition executives were not always clear about how, exactly, the Orlando pilot worked when it became public.

In May, police Chief John Mina asserted that the technology was only being used in the limited pilot inside police headquarters. “It’s all internal,” said Mina. “We’re not tracking citizens.”

A day later, Mina walked back that statement, admitting there were another three Rekognition-enabled video streams surveying downtown Orlando. He added that the city was “a long way” from tracking real persons of interest in police investigations and that the Orlando Police Department “would never use this technology to track random citizens, immigrants, political activists, or certainly, people of color.”

To Georgetown Law’s Garvie, Mina’s pledge was “encouraging,” though without a written policy and regulatory framework, she said, it “means essentially nothing.”

“In my view, law-enforcement agencies should be careful not to deploy facial recognition technology without clear policies for how it will be used that are shaped by public input,” Sen. Ron Wyden, an Oregon Democrat, told BuzzFeed News. “For its part, Amazon has a duty to ensure there are strong safeguards in place against misuse. Facial recognition technology is too powerful to simply be sold to anyone and everyone, regardless of their knowledge or intent.”

During that same press conference, Mina was asked whether raw video feeds from the eight pilot cameras are sent to Amazon servers. “I’m not sure,” the chief said at the time. But through its FOIA request, BuzzFeed News obtained a chart that detailed the Orlando Police Department’s workflow process for Amazon Rekognition; it shows the raw video feed is sent to Amazon servers for processing.

The communication wasn’t much better on Amazon’s side. In May, Ranju Das, director of software development for Amazon Rekognition, described Orlando's facial recognition pilot to attendees of Amazon's AWS Summit in Seoul. “[Orlando has] cameras all over the city," he said. "We analyze the video in real time, search against the collection of faces that they have."

But this was not the case. A week later, Das apologized to Orlando PD in an email obtained by BuzzFeed News.

“During a recent presentation, I misspoke about the City of Orlando’s use of AWS technologies,” Das wrote. “I was aware that the City of Orlando is testing Amazon Rekognition Video … internally to find ways to increase public safety and operational efficiency, but that it was not correct to state that they had installed cameras all over the city or are using these services in production. I sincerely apologize for any misunderstanding."

An Orlando public official emailed Amazon on June 20, noting, “We are seeing an increase in FOIA requests, many of which are from outside our region. We believe the increase is triggered by the CNN article [Amazon shareholders call for halt of facial recognition sales to police].” The city official asked for guidance from Amazon.

In an emailed response on June 21, Amazon pointed the Orlando representative to a blog post from Matt Wood, general manager of artificial intelligence at AWS, which contained critiques of the methods used by the American Civil Liberties Union — the nonprofit that first broke the story on Amazon pitching Rekognition to law enforcement agencies. “Please let me know if you want to get on a call to discuss,” the Amazon representative said. Amazon sent the same blog post to US Immigration and Customs Enforcement, another agency that it marketed its Rekognition tech to, according to documents first reported on by the Daily Beast.

“I am concerned that the city of Orlando appears to be continuing the pilot program using Amazon’s Rekognition facial profiling technology without transparently communicating to the public how they are implementing it,” Democratic Rep. Ro Khanna of California told BuzzFeed News in a statement. “I urge city officials to clarify how they are using this technology and ensure that people’s rights are not being violated through this program.”

Two days after BuzzFeed News sent the city of Orlando a detailed list of questions, the city set up a website with information about its facial recognition pilot. Among the downloadable attachments provided on the website was a new “statement of work” document for its second pilot, which included the city’s “AWS billing account ID” — a piece of information that should not be public, because attackers could possibly use it to gain access to the city’s systems. (BuzzFeed News flagged the item to representatives of the city of Orlando, but the city has yet to redact that information.)

There was a time when it was unclear whether Orlando would push for the continued testing of Amazon’s facial recognition tools. While the city began streaming its video feeds to Amazon Web Services back in February, it was only in May, when the ACLU reported that Amazon was pitching its facial recognition tool to law enforcement agencies, including Orlando’s, that the broad public became aware of the Orlando PD’s facial recognition pilot. Civil liberties groups promptly sounded the alarm on privacy and free speech concerns; for a while, it seemed like maybe police and city officials had been swayed. After about a month of mounting backlash, officials announced that the Orlando PD would end the Rekognition pilot program.

Then the city reversed course. A July 6 memo addressed to Orlando Mayor Buddy Dyer and representatives of Orlando Districts 1 through 6, and written by representatives of the Orlando Police Department, stated: “The pilot aligns with the City’s mission to be financially responsible by leveraging existing resources and technology to improve operational efficiencies supporting OPD in keeping our residents, visitors, and officers safe.”

Here’s a quick and dirty rundown of how Orlando's Rekognition system works, according to documentation obtained by BuzzFeed News: Orlando PD uploads photos of "persons of interest" (officer volunteers) into the system; Rekognition analyzes their faces, maps their features, and indexes them. Then, faces from the city's eight designated live video streams — four from cameras inside Orlando police headquarters, three IRIS cameras in downtown Orlando, and one at another unnamed city facility — are sent to Rekognition and compared against the Orlando-provided collection of faces. If the system detects a match between a pedestrian's face and that of a "person of interest" in its index, it is supposed to instantly send a notification to police officers.
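For readers who want to see the shape of that pipeline, here is a minimal sketch using the publicly documented boto3 interface to Rekognition. The collection name, camera stream, ARNs, role, and match threshold are all hypothetical; the city's documents don't disclose Orlando's actual configuration.

```python
# Minimal sketch of a Rekognition face-search pipeline (boto3).
# All names and ARNs below are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# 1. Build the "persons of interest" collection from volunteer photos.
rekognition.create_collection(CollectionId="persons-of-interest")
with open("volunteer_officer.jpg", "rb") as f:
    rekognition.index_faces(
        CollectionId="persons-of-interest",
        Image={"Bytes": f.read()},
        ExternalImageId="volunteer-01",  # label returned with any match
    )

# 2. Attach a live camera feed (a Kinesis Video Stream) to the collection.
#    Match results are written to a Kinesis Data Stream for alerting.
rekognition.create_stream_processor(
    Name="downtown-camera-1",
    Input={"KinesisVideoStream": {
        "Arn": "arn:aws:kinesisvideo:REGION:ACCOUNT:stream/downtown-camera-1/1"}},
    Output={"KinesisDataStream": {
        "Arn": "arn:aws:kinesis:REGION:ACCOUNT:stream/face-matches"}},
    Settings={
        "FaceSearch": {
            "CollectionId": "persons-of-interest",
            "FaceMatchThreshold": 85.0,  # minimum similarity to report a match
        }
    },
    RoleArn="arn:aws:iam::ACCOUNT:role/RekognitionStreamRole",
)
rekognition.start_stream_processor(Name="downtown-camera-1")
```

From there, whatever consumes the output stream decides what counts as an alert, which is where the policy questions in the rest of this story live.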

For Orlando’s second pilot, according to the July memo, the number of cameras “may be increased” from eight cameras to an unspecified number, “to ensure this technology can function as designed with a larger volume.” The city said it retains video footage for 30 days, after which it is overwritten.

Amazon Web Services is not currently saving any of Orlando’s video streams, according to the memo. But the memo also says Orlando PD is authorized to change its mind and grant Amazon access to the city’s video streams for up to seven days. Amazon did not respond on record to questions about what kind of access the company currently has to Orlando's video feeds. According to AWS documentation, clients' streams are encrypted and company engineers wouldn't typically have access to raw data. But the company’s policies also show that Amazon uses customer data to improve or validate features in Rekognition — so it is possible that engineers could have access to clients' de-identified metadata.
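For context, retention for a Kinesis Video Stream, the AWS service Rekognition Video typically reads from, is a per-stream setting. Here is a sketch of how a retention window like the ones the memo describes could be expressed; the stream name is hypothetical, and there is no indication this is Orlando's actual setup.

```python
import boto3

kvs = boto3.client("kinesisvideo")

# Create a camera stream that retains footage for seven days (168 hours).
# A retention of 0 keeps video only in short-lived service buffers.
kvs.create_stream(
    StreamName="downtown-camera-1",  # hypothetical name
    DataRetentionInHours=24 * 7,
)
```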

A “statement of work” document from the first pilot reviewed by BuzzFeed News showed that Amazon absorbed costs to the city, amounting to $39,000, for a “one time adoption incentive.” The first pilot started on Dec. 19, 2017, and lasted for six months. A second “statement of work” showed that Amazon provided Orlando with another “one time discount” worth $13,800 for a second pilot, which began last week, on Oct. 18, and is expected to last for nine months. Amazon told BuzzFeed News that providing customers with an opportunity to test technology like this for free is "common practice in the industry."

Orlando has high expectations for the system. If it works as planned, it could, according to the July memo, protect police officers from harm; the memo's example scenario involves protecting performer Lana Del Rey.

The goals of the pilot are certainly commendable, and at first glance, the facial recognition system looks straightforward. That is, until you start digging into details like the possibility of surveillance of someone who is not a person of interest. Or looking for the sort of precautions and audits that might prevent a false positive arrest. Or for a process for citizens to appeal and get themselves removed from the person of interest list if they think police were mistaken.

Chief Mina has said Orlando was not tracking any citizens in its first-phase pilot, and the police department’s most recent memo on its second-phase pilot says the same — the technology tracks only the seven police officer volunteers. But in order to effectively flag the faces of volunteer "persons of interest," the system must also disregard the faces of persons who are not of interest. In other words, it analyzes them, too.
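The output format of Rekognition's stream processor makes this concrete: every face detected in the feed appears in the result records, whether or not it matched. A sketch of reading that output follows, with a hypothetical alerting hook.

```python
import json

def handle_records(kinesis_records, alert):
    """Read Rekognition stream-processor results from a Kinesis stream.

    Every face detected in the video shows up as a DetectedFace entry;
    only faces that matched the "persons of interest" collection carry
    a non-empty MatchedFaces list.
    """
    for record in kinesis_records:
        payload = json.loads(record["Data"])
        for result in payload.get("FaceSearchResponse", []):
            detected = result["DetectedFace"]  # analyzed whether or not it matches
            matches = result.get("MatchedFaces", [])
            if matches:
                alert(detected, matches)  # hypothetical notification hook
            # Non-matching faces were still detected and analyzed,
            # then discarded by the application.
```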

It is a problem, said Suresh Venkatasubramanian, a computer science professor at the University of Utah who teaches ethics in data science, that officers don’t appear to have been trained on any rules for entering images into Orlando’s person of interest database. “The way this is set up, any image can be uploaded,” Venkatasubramanian said. Police seem to have full discretion in adding whomever they consider to be suspicious.

According to Amazon’s documentation, Rekognition by default indexes up to 100 of the largest faces in an image submitted to the “persons of interest” list, and indexing is automated — so if the Orlando police were to upload images that included bystanders, it's possible individuals who aren't considered persons of interest could be inadvertently added to the list. Once you’re on the list, there are currently no policies or procedures for appealing to be taken off.
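For illustration, the documented IndexFaces parameters do let a caller constrain this behavior. A sketch, assuming the standard boto3 call; the one-face cap, file name, and collection name are illustrative, not Orlando's settings.

```python
import boto3

rekognition = boto3.client("rekognition")

# Without a MaxFaces cap, IndexFaces adds the largest faces it detects,
# including bystanders, to the collection. Capping it at 1 indexes only
# the most prominent face in the photo.
with open("person_of_interest.jpg", "rb") as f:  # hypothetical photo
    response = rekognition.index_faces(
        CollectionId="persons-of-interest",      # hypothetical collection
        Image={"Bytes": f.read()},
        MaxFaces=1,            # index only the single largest face
        QualityFilter="AUTO",  # skip blurry or low-quality detections
    )

# Faces detected but not indexed are reported back for review.
unindexed = response.get("UnindexedFaces", [])
```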

Sgt. Bernal said in an email to BuzzFeed News that Orlando police have not received hands-on training from Amazon on the standards or specifications for what kinds of photos can be uploaded to build the police department's "persons of interest" list.

The stakes would be especially high if Orlando’s system misidentified a suspect, said Georgetown Law's Garvie. “If, for example, police are only looking for violent criminals, [and the system] sends an alert out to the field that somebody is considered armed and dangerous, a reasonable officer is going to act with force against that suspect,” she said. “What if the system got it wrong? There needs to be checks in place against overreliance on a faulty tool.”

Sgt. Bernal confirmed there are no policies around misidentification and appeals. “If the Orlando Police Department decides to ultimately implement official use of the technology, city staff would explore procurement and develop a policy governing the technology,” he explained.

The concern is not just that officers won’t know what to do when faced with a false positive or false negative result. Untrained analysts can also have an impact on criminal sentencing. A Jacksonville, Florida, man is currently appealing an eight-year prison sentence on the grounds that no one in the Jacksonville Sheriff’s Office knew how facial recognition technology worked at the time it was used to identify him. The deposition of the crime analyst who identified him lends credence to that argument. She was unable to “definitively” explain how the technology’s star-rating system was used to determine the quality of a match. The Jacksonville Sheriff’s Office also does not have a policy governing the system’s use.

A single incident, but a troubling one. Poor understanding of how facial recognition systems work can not only have a direct impact on the trajectory of a person’s life; it can also magnify entrenched cultural biases.

Legislators worry about this too. “I am particularly concerned about facial recognition’s role in perpetuating, and in some cases amplifying, racial, gender, and age biases, as the data used to develop these technologies is often sourced disproportionately from one group,” Democratic Sen. Kamala Harris of California said in a statement to BuzzFeed News. “Government agencies employing facial recognition technology must be transparent about how they are using it and held accountable for the outcomes it produces.”

The pace at which the Orlando Police Department is moving on facial recognition technology while demonstrating a limited grasp of how it works is concerning, said Scott Maxwell, an Orlando resident and columnist at the Orlando Sentinel. “I’m someone who isn’t inherently troubled by video capture in public places where there’s neither a legal or logical expectation of privacy,” Maxwell told BuzzFeed News. “But I was troubled by the apparent lack of understanding Orlando officials had in their own pilot program.”

"It’s hard for citizens to have confidence in a pioneering new program when the leaders don’t seem to fully understand what the hell they’re pioneering."

And the city could push the boundaries of facial recognition and surveillance further still. In May, Orlando police reportedly finalized a deal to crowdsource neighborhood footage from residents of the city using a service called Ring, which pairs a doorbell camera with a smartphone app called Neighbors that provides residents with real-time safety information. The Orlando City Council and Ring, which was acquired by Amazon in April, entered into an agreement in which city police could ask to gain access to the security footage of about 10,000 Ring users in Orlando. When there is an open case, police may ask Ring to put out a message to people in "a very specific geographic area," according to Orlando Police spokeswoman Michelle Guido. Anyone within that area would "individually have to give consent" before Ring shared that video with officers.

There is no automatic sharing of data between Ring and Rekognition right now. But Amazon would not comment on a commitment to keep the data generated by each of these companies completely separate.

For Orlando residents who have been blindsided by the city’s experimentation before, this is worrisome. “When Rekognition launched, the city clearly didn’t have its act together,” Maxwell said. “The police chief said one thing one day — and something else the next. It’s hard for citizens to have confidence in a pioneering new program when the leaders don’t seem to fully understand what the hell they’re pioneering.”

Technologists say Amazon should be taking a share of the responsibility too. “Jeff Bezos has been defending Amazon's position by appealing to the rhetoric of 'patriotism' and the cliché that [facial recognition] technology is neutral — that what matters is whether it’s used by good or bad actors,” Evan Selinger, a professor of philosophy at Rochester Institute of Technology, told BuzzFeed News.

“At best, this is a naive position. True patriots realize that simply following the law doesn't necessarily mean you're using technology ethically.” ●

CORRECTION

There were 170,000 people who arrived at Cardiff, South Wales, for the Champions League soccer final in 2017. An earlier version of this article stated these 170,000 attended the game.

UPDATE

This story has been updated with additional details about the Amazon-owned home security service Ring.


