The US Government Will Be Scanning Your Face At 20 Top Airports, Documents Show
“This is opening the door to an extraordinarily more intrusive and granular level of government control.”
In March 2017, President Trump issued an executive order expediting the deployment of biometric verification of the identities of all travelers crossing US borders. That mandate stipulates facial recognition identification for “100 percent of all international passengers,” including American citizens, in the top 20 US airports by 2021. Now, the United States Department of Homeland Security is rushing to get those systems up and running at airports across the country. But it is doing so without proper vetting or regulatory safeguards, and, some privacy advocates argue, in defiance of the law.
According to 346 pages of documents obtained by the nonprofit research organization Electronic Privacy Information Center — shared exclusively with BuzzFeed News and made public on Monday as part of Sunshine Week — US Customs and Border Protection is scrambling to implement this “biometric entry-exit system.” The goal: to use facial recognition technology on travelers aboard 16,300 flights per week — or more than 100 million passengers traveling on international flights out of the United States — in as little as two years, meeting Trump's accelerated timeline for a biometric system that had initially been signed into law by the Obama administration. This, despite questionable biometric confirmation rates and few, if any, legal guardrails.
These same documents state — explicitly — that there are no limits on how partnering airlines can use this facial recognition data. CBP did not answer specific questions about whether there are any guidelines for how other technology companies involved in processing the data can potentially also use it. It was only during a data privacy meeting last December that CBP made a sharp turn and barred participating companies from using this data. But it is unclear to what extent it has enforced this new rule. CBP did not explain what its current policies around data sharing of biometric information with participating companies and third-party firms are, but it did say that the agency “retains photos … for up to 14 days” of non-US citizens departing the country, for “evaluation of the technology” and “assurance of the accuracy of the algorithms” — which implies such photos might be used for further training of its facial matching AI.
“Government, without consulting the public is using facial recognition to create a digital ID of millions of Americans.”
“CBP is solving a security challenge by adding a convenience for travelers,” a spokesperson said in an emailed response to a detailed list of questions from BuzzFeed News. “By partnering with airports and airlines to provide a secure stand-alone system that works quickly and reliably, which they will integrate into their boarding process, CBP does not have to rebuild everything from the ground up as we drive innovation across the travel experience.”
The documents also suggest that CBP skipped portions of a critical “rulemaking process,” which requires the agency to solicit public feedback before adopting technology intended to be broadly used on civilians, a claim privacy advocates corroborate. This is worrisome because — beyond its privacy, surveillance, and free speech implications — facial recognition technology is currently troubled by issues of inaccuracy and bias. Last summer, the American Civil Liberties Union reported that Amazon’s facial recognition technology falsely matched 28 members of Congress with arrest mugshots. These false matches were disproportionately people of color.
“I think it’s important to note what the use of facial recognition [in airports] means for American citizens,” Jeramie Scott, director of EPIC’s Domestic Surveillance Project, told BuzzFeed News in an interview. “It means the government, without consulting the public, a requirement by Congress, or consent from any individual, is using facial recognition to create a digital ID of millions of Americans.”
“CBP took images from the State Department that were submitted to obtain a passport and decided to use them to track travelers in and out of the country,” Scott said.
“Facial recognition is becoming normalized as an infrastructure for checkpoint control,” said Jay Stanley, an American Civil Liberties Union senior policy analyst and a participant at meetings that CBP has organized with privacy advocates. “It's an extremely powerful surveillance technology that has the potential to do things never before done in human history. Yet the government is hurtling along a path towards its broad deployment — and in this case, a deployment that seems quite unjustified and unnecessary.”
“The government is hurtling along a path towards facial recognition's broad deployment — and in this case, a deployment that seems unjustified and unnecessary.”
In response, CBP intimated that there shouldn’t be any privacy concerns around its biometric facial recognition program. “CBP is committed to protecting the privacy of all travelers and has issued several Privacy Impact Assessments related to [its biometric entry-exit program], employed strong technical security safeguards, and has limited the amount of personally identifiable information used in the transaction,” the agency spokesperson said.
But this statement belies the far-reaching ambitions of the program, according to the documents reviewed by BuzzFeed News. CBP, the documents say, wants facial recognition at “initial operating capability” by year’s end, with the agency using it for as many as 30 international flights across more than a dozen US airports per day.
In the US, there are no laws governing the use of facial recognition. Courts have not ruled on whether it constitutes a search under the Fourth Amendment. There are no checks, no balances. Yet government agencies are working quickly to roll it out in every major airport in the country. It’s already being used in 17 international airports, among them Atlanta, New York City, Boston, San Jose, Chicago, and two airports in Houston. Many major airlines are on board with the idea — Delta, JetBlue, British Airways, Lufthansa, and American Airlines. Airport operations companies, including Los Angeles World Airports, Greater Orlando Aviation Authority, Mineta San Jose International Airport, Miami International Airport, and the Metropolitan Washington Airports Authority, are also involved.
“THE MOST OPERATIONALLY FEASIBLE AND TRAVELER-FRIENDLY OPTION”
“Airlines, airports, TSA, and CBP are facing fixed airport infrastructure with little opportunities for major investment, increased national security threats with pressures for solutions, and increased traveler volume,” CBP’s Concept of Operations document, released in June 2017, states. “Collectively, this is a status quo that is not sustainable for any of the main stakeholders, and failure to change will ultimately result in increases in dissatisfied customers, use of alternative modes of travel, and vulnerability to serious threats.”
In June 2016, CBP began its first pilot for facial recognition technology in airports at the Hartsfield–Jackson Atlanta International Airport. Once a day, for a flight from Atlanta to Tokyo, Japan, passengers’ passport photos were biometrically matched to real-time photographs. Before travelers proceeded to the passenger loading bridge to board their flight, CBP officers had them scan their boarding passes while a camera snapped a digital image of each traveler’s face; a CBP-developed back-end system called the Departure Information System then used facial recognition to automatically compare those photos against a gallery of the passengers’ passport photos during boarding. Everyone between the ages of 14 and 79 was expected to participate.
“Failure to change will ultimately result in increases in dissatisfied customers, use of alternative modes of travel, and vulnerability to serious threats.”
The CBP’s stated goal here was simply to “identify any non-U.S. citizens subject to the exit requirements who may fraudulently present” travel documents. The agency said it had “no plans to biometrically record the departure of U.S. citizens.” But the CBP also said it “does not believe there is enough time to separate U.S. citizens from non-U.S. citizen visitors prior to boarding” … “therefore, facial images will be collected for U.S. citizens as part of this test so that CBP can verify the identity of a U.S. citizen boarding the air carrier.” CBP said that once a traveler is identified and confirmed as a U.S. citizen, their images are deleted.
Three months later, the agency switched to a daily flight from Atlanta to Mexico City. By the end of November 2016, CBP was running tests on an average of seven flights per week. From these tests, according to a DHS Office of Inspector General (OIG) audit of the government’s facial recognition biometrics program, published in September 2018, “CBP concluded that facial recognition technology … was the most operationally feasible and traveler-friendly option for a comprehensive biometric solution.”
In June 2017, CBP added three more international airport locations “to further assess facial matching technology as a viable solution,” according to the OIG report. Five more airports followed by October 2017. Today, 17 airports* are in the program, with three more in the works.
During its 2017 expansion, CBP’s Departure Information System was replaced by a more advanced automated matching system, called Traveler Verification Service (TVS). As CBP documents explained, TVS could “[operate] in a virtual, cloud-based infrastructure that can store images temporarily and operate using a wireless network.” Once a passenger boards a plane, TVS also automatically transmits confirmation of the biometric match to other DHS systems.
CBP says it allows U.S. citizens to decline facial verification and to instead have their identities confirmed through the usual manual boarding process. “CBP works with airline and airport partners to incorporate notifications and processes into their current business models, including signage and gate announcements, to ensure transparency of the biometric process,” an agency spokesperson said in an email to BuzzFeed News. But across 12 flights observed by the OIG during its 2017 audit, only 16 passengers declined to participate.
According to Delta, less than 2% of its weekly 25,000 passengers going through the Atlanta airport’s Terminal F, which features “curb to gate” facial recognition systems, opt out of using the tech.
CBP officers also have wide latitude for how to handle travelers whose faces are obscured for religious reasons. A previously unpublished document detailing the standard operating procedure for the TVS described how officers may deal with airplane passengers wearing religious headwear. “For travelers with religious headwear that covers their face, officer discretion may be used consistent with CBP Policy,” it says.
There were also issues with matching. The OIG audit, which covered fieldwork by DHS from August to December 2017, a period when TVS was actively in use, found that CBP was able to provide biometric confirmation for only 85% of passengers processed. Its matches for certain age groups and nationalities were inconsistent; Mexican and Canadian citizens proved particularly difficult to match. (It’s worth noting that the CBP’s Concept of Operations document includes some discussion of “a data exchange with Mexico and Canada.”)
“The low 85-percent biometric confirmation rate poses questions as to whether CBP will meet its milestone to confirm all foreign departures at the top 20 US airports by fiscal year 2021,” the audit said. Confirmation rates for CBP’s biometric exit system have since risen to 98.6%, according to an agency spokesperson.
OIG also found that CBP had “not previously established a metric for photo matching.” According to the OIG report, the TVS matching threshold can be set high, strictly limiting what the system considers a match but lowering the verification rate, or set low, verifying more people but likely increasing false positives.
“In theory, they could move the threshold down to zero — which would be a system that says, ‘Only Clare is allowed to board the plane, everybody is Clare, so everybody can board the plane,’” said Clare Garvie, an associate at Georgetown Law's Center on Privacy and Technology. “That’s a system that technically has a 100% match rate.”
“In theory, they could move the threshold down to zero — which would be a system that says, ‘Only Clare is allowed to board the plane, everybody is Clare, so everybody can board the plane.’”
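Garvie’s point about the threshold tradeoff can be illustrated with a toy simulation. The score distributions below are invented for illustration — they are not CBP’s, and TVS’s internals are not public — but they show why a threshold of zero yields a nominal 100% match rate while admitting nearly every impostor:

```python
import random

def simulate_match_rates(threshold, n=10_000, seed=0):
    """Toy model: genuine (same-person) comparisons score high on
    average, impostor (different-person) comparisons score low.
    Returns (fraction of genuine travelers verified,
             fraction of impostors falsely matched) at the threshold."""
    rng = random.Random(seed)
    genuine = [rng.gauss(0.8, 0.1) for _ in range(n)]   # same-person scores
    impostor = [rng.gauss(0.3, 0.1) for _ in range(n)]  # different-person scores
    verified = sum(s >= threshold for s in genuine) / n
    false_pos = sum(s >= threshold for s in impostor) / n
    return verified, false_pos

# A strict threshold verifies fewer genuine travelers but admits almost
# no impostors; a threshold of zero "verifies" everyone, impostors
# included -- a nominal 100% match rate that means nothing.
for t in (0.0, 0.5, 0.7):
    v, fp = simulate_match_rates(t)
    print(f"threshold={t:.1f}  verified={v:.1%}  false_positives={fp:.1%}")
```

Moving the threshold trades one error for the other, which is why a match rate quoted without an accompanying false-positive rate — the metric OIG found CBP had not established — says little on its own.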
Caryl Spoden, JetBlue’s head of customer experience, told BuzzFeed News that for boarding, the image gallery JetBlue uses to compare faces is made up of no more than 200 customers — the capacity of the airline’s A321 aircraft — making the matching process “very accurate.”
“CBP sets their match rate goal for in-scope travelers, which are those aged 14–79, to be greater than 97%,” Spoden said. “They set the goal for [false positives], which is when the system misidentifies a customer as another, to be less than or equal to 0.1%.”
“It sounds like CBP has finally set a false positive rate, which is something that hasn't been mentioned in the past,” Garvie told BuzzFeed News.
The government’s end vision, according to an early “Biometric Pathway” document from December 2016, is for CBP to build a vast “backend communication portal to support TSA, airport, and airline partners in their efforts to use facial images as a single biometric key for identifying and matching travelers to their identities.”
“This will enable ... verified biometrics for check-in, baggage drop, security checkpoints, lounge access, boarding, and other processes,” the document says. “This will create simplified and standardized wayfinding across airports.”
In other words: surveillance throughout the airport.
“FUTURE BUSINESS AGREEMENTS OR COMMERCIAL OPPORTUNITIES”
If you ask the CBP how it is storing and securing the biometric data it currently collects, the agency will tell you that faces of U.S. citizens are stored for “up to 12 hours after capture” and that airline and airport partners “must purge the photos, once they are transferred to CBP and must allow CBP to audit compliance.” But that’s a new rule that has only recently gone into effect. CBP did not respond to questions from BuzzFeed News on whether this rule has been applied retroactively.
“This is obviously a change from what CBP has said previously,” said EPIC’s Scott, adding that the first change in policy he was aware of was three months ago, mentioned during a December 2018 DHS meeting on data privacy and integrity.
“Through DHS’s Data Privacy and Integrity Advisory Committee (DPIAC) … discussions included review of current pilots, retention policies, future biometric vision, and alternative screening procedures,” the CBP spokesperson told BuzzFeed News. “CBP briefed the DPIAC in September 2017, in May 2018, and again in July 2018, where CBP provided a tour of biometric entry and exit operations at Orlando International Airport.”
But Scott told BuzzFeed News that DPIAC's review of the privacy issues raised by the use of facial recognition at airports is "no replacement for soliciting public comments as required by federal law when agencies make changes that substantively impact the public."
Previously, according to the first public CBP memorandum of understanding, signed sometime in fall 2017, “airline partners” had to find their own “technology partner” to provide the front-end equipment needed to capture a traveler’s photo, “in accordance with CBP’s specifications.” In exchange, the airline could seemingly use the data it gathered as it wished. “Nothing … preclude[s] any Party from entering into future business agreements or commercial opportunities,” the CBP’s memorandum of agreement states. In other words, the understanding appeared to be that there were no commercial limits for “airline partners”; if they wanted to sell or somehow monetize the biometric data they collect, there was nothing stopping them. (CBP did not respond to questions from BuzzFeed News asking for clarification of this document's phrasing.)
“Nothing … preclude[s] any Party from entering into future business agreements or commercial opportunities.”
“When people think about DHS, what they don’t necessarily think about is the power of the many billions of dollars in DHS contracts in what is now [an industry] quite entrenched after almost 20 years,” Edward Hasbrouck, a consultant to the Identity Project civil liberties group, and a representative at meetings between CBP and advocates, said. The government's airport facial recognition program, he said, is just part of the creation of a new kind of “homeland security industrial complex.”
According to CBP, just like for its Atlanta pilot, photos of U.S. citizens are deleted when the system has successfully confirmed their identities. For noncitizens, however, photos taken upon arrival are stored in CBP’s system for 75 years.
Meanwhile, photos taken when noncitizens depart are saved for up to 14 days, according to a CBP spokesperson, for several reasons: confirming the travelers' identities, evaluating the technology and the accuracy of CBP's algorithms, and system audits. “The airlines are not permitted to use the photos for another purpose,” the spokesperson said.
Contacted by BuzzFeed News, Delta said it does not store or manage customer biometric information. The cameras that the airline uses are configured so they do not have the ability to save an image, the company said, and simply capture a traveler’s photo, which is then encrypted and transmitted in a “de-identified” format, and sent to CBP for verification. Then, according to Delta, CBP sends a confirmation that the customer can proceed.
Government documents show a more complicated process. One, called a “capability development plan,” dated February 2017, states that CBP intends to do “matching and storing in a secure cloud environment.” Currently, the agency said, it does not have this capability, but it notes it is “available in the commercial environment through open competition.” The CBP did not answer questions about which companies’ cloud services the agency and its third-party stakeholders use. But two tech giants, Microsoft and Amazon, currently have robust cloud services along with the “authority to operate” designation from high-level government agencies, like U.S. Immigration and Customs Enforcement (ICE).
Amazon told BuzzFeed News that as a policy, it does not discuss specific customers who have not agreed to be a public reference. Microsoft declined to respond to inquiries from BuzzFeed News about any work it may do with CBP.
Metropolitan Washington Airports Authority, a partner of CBP and an entity that oversees the operations of airports serving the U.S. capital, said that stakeholders can choose to work with whomever they like. A spokesperson said MWAA’s biometric facial recognition technology, VeriScan, uses cloud services.
According to the Concept of Operations document, “By partnering with other stakeholders, CBP can facilitate a large-scale transformation of air travel that, by using biometrics, will make air travel more secure … providing increased certainty as to the identity of airline travelers at multiple points in the travel process” and “build additional integrity into the immigration system.” Biometric capture, CBP explained, would be “integrated” into the “systems and business processes” of other stakeholders, including private ones like airports and airlines.
The idea is for CBP to be able to scale up the effort considerably. “Instead of a program that is built and developed exclusively by CBP, and that benefits only CBP missions,” the document states, “the result is a series of interconnected initiatives undertaken by multiple stakeholders, both public and private, and through which all will significantly benefit.”
Pam Dixon, executive director of the World Privacy Forum, said such an arrangement is tantamount to "a mandatory situation where we’re giving the airlines our biometric data — and other commercial partners — and we don’t even know who they are."
"If the U.S. government wants to run this program, the U.S. government should take the pictures, and only they should have access to the photos.”
“WHEN DISCUSSING DATA COLLECTION, PLEASE BE SURE TO EMPHASIZE …”
CBP’s Public Affairs Guidance for a facial recognition pilot in Atlanta, which has not been previously made public, details the agency’s talking points and press strategy on its use of facial recognition tech, especially around privacy concerns. “Extreme care should be taken when communicating with external audiences. … It is important to minimize any chance for CBP’s communications to be distorted into appearing unnecessarily invasive or broad,” the document reads. “When discussing data collection, please be sure to emphasize that this test is authorized by existing laws.”
Privacy advocates dispute this. According to the Identity Project's Hasbrouck, the government’s program was executed without following the law at all. “DHS doesn’t come out in court and say, ‘We’re doing this right, and here’s why,’” he said. “Their response is: ‘The courts shouldn’t even be looking at us. We should be allowed unfettered, secretive discretion to do this with administrative fiat.’”
“We were active participants in hearing them tell us what they were going to do.”
Other privacy and civil rights activists — representing EPIC, World Privacy Forum, and the ACLU — backed this up, saying their perspectives were not considered by CBP when the biometrics program was being developed and trialed, as government documents stipulate they should have been.
“[We were not] active participants in the development of CBP's program,” said the ACLU’s Stanley. “We were active participants in hearing them tell us what they were going to do.”
“They wanted to get the Electronic Frontier Foundation and ACLU stamp of approval,” said Hasbrouck. “What they wanted to say was, ‘How can we do this inherently invasive thing in ways that aren’t that invasive?’ But most people, at least at the meeting I was at, said, ‘No, you shouldn’t be doing this at all.’” Hasbrouck added that “any pretense or claim” by the CBP that there is still ongoing involvement from privacy advocates is “completely unfounded.”
CBP held two meetings with privacy advocates: one in Washington, DC, in August 2017, and another in San Francisco five months later in January 2018. “They didn’t have a review of their pilots with the public,” said Dixon, “and they didn’t reach out to us [after that] for further discussion.”
“The biggest thing they’re testing is how much legal resistance there will be.”
In response, an agency spokesperson said, “CBP has worked with the DHS Data Privacy Integrity Advisory Committee and sought input from several privacy groups regarding the deployment of entry-exit operations. Overall, no new data or additional information is collected on U.S. citizens under this program.”
Nowadays, all CBP is “testing” is “how to structure the program to make it technically work, and what tweaks the agency might need to make to appease, or suppress, or frustrate protests and legal challenges,” said Hasbrouck.
“But the biggest thing they’re testing is how much legal resistance there will be — whether that’s people saying ‘no’ [to their faces being captured at the airport], or challenging it in court.”
This is not the first time DHS has seemingly overstepped its boundaries. In the mid-2000s, EPIC sued to obtain records describing problems with the TSA’s airport body scanners: invasive screening practices, potential health risks, traveler complaints, and more. Then in 2011, EPIC sued again, asking the courts to compel DHS to undertake a public notice-and-comment rulemaking on the use of body scanners. As EPIC argued, "The TSA has acted outside of its regulatory authority and with profound disregard for the statutory and constitutional rights of air travelers." The DC Circuit agreed, and for the first time, the public was allowed to comment on the body scanner program.
But this time, DHS appears to be arguing, a facial recognition program at the border is so critical that it should be implemented, even without going through all the steps of the rulemaking process. Three internal documents seen by BuzzFeed News state, “CBP will transform the way it identifies travelers by shifting the key to unlocking a traveler’s record from biographic identifiers to biometric ones — primarily a traveler’s face.”
Proponents of facial recognition praise the technology for improving the protection of the United States against external threats. Since the implementation of facial biometrics in the arrival process of U.S. airports, according to an agency spokesperson, CBP officers have “successfully intercepted a total of five imposters who were denied admission to the United States.” And CBP didn’t just catch impostors coming to the U.S. through airports. “Following implementation of the facial biometric technology demonstrations at the land border in the fall of 2018, CBP has identified 64 imposters who attempted to enter the United States by presenting genuine travel documents not legitimately issued to them,” the spokesperson said.
“This is opening the door to an extraordinarily more intrusive and granular level of government control.”
Still, concerns about just how quickly facial recognition technology has been adopted, particularly in airports, have come up in Congress. In May, two U.S. senators sent a letter to DHS, urging that formal rules be put into place before the airport biometrics program is expanded. The senators, Democratic Sen. Ed Markey and Republican Sen. Mike Lee, were united in this bipartisan opinion, even in one of the most highly polarized political climates in recent memory.
“[This] will … ensure a full vetting of this potentially sweeping program that could impact every American leaving this country by airport,” the two senators wrote.
Currently, no legislation has been introduced to curb what EPIC’s Scott calls the “mission creep” of facial recognition into airports.
For Hasbrouck, the big takeaway is that the broad surveillance of people in airports amounts to a kind of "individualized control of citizenry" — not unlike what’s already happening with the social credit scoring system in China. "There are already people who aren’t allowed on, say, a high-speed train because their social credit scores are too low," he said, pointing out that China’s program is significantly based in "identifying individual people and tracking their movements in public spaces through automated facial recognition."
“This is opening the door to an extraordinarily more intrusive and granular level of government control, starting with where we can go and our ability to move freely about the country,” Hasbrouck said. “And then potentially, once the system is proved out in that way, it can literally extend to a vast number of controls in other parts of our lives.” ●
* As of the time of publication, the airports included in CBP’s biometric facial recognition program are in Atlanta, Chicago, Seattle, San Francisco, Las Vegas, Los Angeles, Washington (Dulles and Reagan), Boston, Fort Lauderdale, Houston Hobby, Dallas/Fort Worth, JFK, Miami, San Jose, Orlando, and Detroit.
US Department of Homeland Security and US Customs and Border Protection documents referenced (in chronological order of document dates):
Biometric Pathway: Transforming Air Travel, December 2016
Capability Analysis Study Plan for Biometric Entry-Exit, January 2017
Biometric Entry-Exit Program Mission Needs Statement, February 2017
Biometric Entry-Exit Program Capability Development Plan, February 2017
Technical Match Rates Over Time, September 2017