Same Problem, Different Day: Government Accountability Office Updates Its Review of FBI’s Use of Face Recognition—and It’s Still Terrible

By Jennifer Lynch

This week the federal Government Accountability Office (GAO) issued an update to its 2016 report on the FBI’s use of face recognition. The takeaway, which GAO also shared during a House Oversight Committee hearing: the FBI now has access to 641 million photos—including driver’s license and ID photos—but it still refuses to assess the accuracy of its systems.

According to the latest GAO Report, FBI’s Facial Analysis, Comparison, and Evaluation (FACE) Services unit not only has access to FBI’s Next Generation Identification (NGI) face recognition database of nearly 30 million civil and criminal mug shot photos, it also has access to the State Department’s Visa and Passport databases, the Defense Department’s biometric database, and the driver’s license databases of at least 21 states. Totaling 641 million images—an increase of 230 million images since GAO’s 2016 report—this is an unprecedented number of photographs, most of which are of Americans and foreigners who have committed no crimes.

The FBI Still Hasn’t Properly Tested the Accuracy of Its Internal or External Searches

Although GAO criticized FBI in 2016 for failing to conduct accuracy assessments of either its internal NGI database or the searches it conducts on its state and federal partners’ databases, the FBI has done little in the last three years to make sure that its search results are accurate, according to the new report. As of 2016, the FBI had conducted only very limited testing to assess the accuracy of NGI’s face recognition capabilities. These tests only assessed the ability of the system to detect a match—not whether that detection was accurate, and as GAO notes, “reporting a detection rate of 86 percent without reporting the accompanying false positive rate presents an incomplete view of the system’s accuracy.”
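GAO’s point about incomplete metrics can be made concrete with a toy example. The figures below are entirely hypothetical—they are not the FBI’s numbers—but they show how a system can report a healthy-sounding detection rate while still generating a large pool of wrongly matched people:

```python
# Toy illustration of why a detection (true positive) rate alone is
# misleading. All numbers here are hypothetical, not FBI figures.

# Suppose a search is run against 1,000 probe photos:
# 100 probes have a true match in the database, 900 do not.
true_match_probes = 100
no_match_probes = 900

# The system returns the correct candidate for 86 of the 100
# probes that really are in the database (an 86% detection rate)...
true_positives = 86

# ...but it ALSO returns a (wrong) candidate for 90 of the 900
# probes that have no match in the database at all.
false_positives = 90

detection_rate = true_positives / true_match_probes
false_positive_rate = false_positives / no_match_probes

print(f"Detection rate:      {detection_rate:.0%}")        # 86%
print(f"False positive rate: {false_positive_rate:.0%}")   # 10%

# Of all candidates the system hands to investigators, roughly
# half are innocent people who were never in the database:
total_candidates = true_positives + false_positives
wrong_share = false_positives / total_candidates
print(f"Share of returned candidates who are wrong matches: {wrong_share:.0%}")
```

In this sketch, the headline “86 percent detection rate” hides the fact that about half of the investigative leads the system produces point at the wrong person—which is exactly why GAO insists the false positive rate must be reported alongside it.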

As we know from previous research, face recognition is notoriously inaccurate across the board and may misidentify African Americans and ethnic minorities, young people, and women at higher rates than whites, older people, and men, respectively. GAO writes—and we agree—that by failing to assess the accuracy of its internal systems, the FBI is also failing to ensure it is “sufficiently protecting the privacy and civil liberties of U.S. citizens enrolled in the database.” This is especially concerning given that, according to the FBI, the Bureau ran a massive 152,500 searches between fiscal year 2017 and April 2019—after the original report came out.

The FBI also has not taken any steps to determine whether the face recognition systems of its external partners—states and other federal agencies—are sufficiently accurate to prevent innocent people from being identified as criminal suspects. These databases, which are accessible to the FACE Services unit, are mostly made up of images taken for identification, certification, or other non-criminal purposes. Extending their use to FBI investigations exacerbates accuracy concerns, not least because, as GAO notes, the “FBI’s accuracy requirements for criminal investigative purposes may be different than a state’s accuracy requirements for preventing driver’s license fraud.” The FBI claims that it has no authority to set or enforce accuracy standards outside the agency. GAO disagrees: because the FBI is using these outside databases as a component of its routine operations, it is responsible for ensuring the systems are accurate, and given the lack of testing, it is unclear “whether photos of innocent people are unnecessarily included as investigative leads.”

Many of the 641 million face images to which the FBI has access come from 21 states’ driver’s license databases. Ten more states are in negotiations to provide similar access.

As the report points out, most of the 641 million face images to which the FBI has access—like driver’s license and passport and visa photos—were never collected for criminal or national security purposes. And yet, under agreements and “Memorandums of Understanding” we’ve never seen between the FBI and its state and federal partners, the FBI may search these civil photos whenever it’s trying to find a suspect in a crime. As the map above shows, 10 more states are in negotiations with the FBI to provide similar access to their driver’s license databases.

Images from the states’ databases aren’t only available through external searches. The states have also been very involved in the development of the FBI’s own NGI database, which includes nearly 30 million of the 641 million face images accessible to the Bureau (we’ve written extensively about NGI in the past). As of 2016, NGI included more than 20 million civil and criminal images received directly from at least six states, including California, Louisiana, Michigan, New York, Texas, and Virginia. And it’s not a one-way street: it appears that five additional states—Florida, Maryland, Maine, New Mexico, and Arkansas—could send their own search requests directly to the NGI database. As of December 2015, the FBI was working with eight more states to grant them access to NGI, and an additional 24 states were also interested.

New Report, Same Criticisms

The original GAO report heavily criticized the FBI for rolling out these massive face recognition capabilities without ever explaining the privacy implications of its actions to the public, and the current report reiterates those criticisms. Federal law and Department of Justice policies require the FBI to complete a Privacy Impact Assessment (PIA) of all programs that collect data on Americans, both at the beginning of development and any time there’s a significant change to the program. While the FBI produced a PIA in 2008, when it first started planning out the face recognition component of NGI, it didn’t update that PIA until late 2015—seven years later and well after it began making the changes. It also failed to produce a PIA for the FACE Services unit until May 2015—three years after FACE began supporting FBI with face recognition searches.

Federal law and regulations also require agencies to publish a “System of Records Notice” (SORN) in the Federal Register, which announces any new federal system designed to collect and use information on Americans. SORNs are important to inform the public of the existence of systems of records; the kinds of information maintained; the kinds of individuals on whom information is maintained; the purposes for which they are used; and how individuals can exercise their rights under the Privacy Act. Although agencies are required to do this before they start operating their systems, FBI failed to issue one until May 2016—five years after it started collecting personal information on Americans. As GAO noted, the whole point of PIAs and SORNs is to give the public notice of the privacy implications of data collection programs and to ensure that privacy protections are built into systems from the start. The FBI failed at this.

This latest GAO report couldn’t come at a more important time. There is a growing mountain of evidence that face recognition used by law enforcement is dangerously inaccurate, from our white paper, “Face Off,” to two Georgetown studies released just last month which show that law enforcement agencies in some cities are implementing real-time face recognition systems and others are using the systems on flawed data.

Two years ago, EFF testified before the House Oversight Committee on the subject, pointing out the FBI’s efforts to build up and link together these massive face recognition databases that may be used to track innocent people as they go about their daily lives. The committee held two more hearings on the subject in the last month, which saw bipartisan agreement over the need to rein in law enforcement’s use of this technology, and during which GAO pointed out many of the issues raised by this report. At least one more hearing is planned. As the committee continues to assess law enforcement use of face recognition databases, and as more and more cities work to incorporate flawed and untested face recognition technology into their police and government-maintained cameras, we need all the information we can get on how law enforcement agencies like the FBI are currently using face recognition and how they plan to use it in the future. Armed with that knowledge, we can push cities, states, and possibly even the federal government to pass moratoria or bans on the use of face recognition.


As Surveillance Litigation Director, Jennifer Lynch leads EFF’s legal work challenging government abuse of search and seizure technologies through the courts by filing lawsuits and amicus briefs in state and federal courts, including the U.S. Supreme Court, on important issues at the intersection of technology and privacy. Jennifer founded EFF’s Street Level Surveillance Project, which informs advocates, defense attorneys, and decisionmakers about new police tools, and in 2017, the First Amendment Coalition awarded her its Free Speech and Open Government Award for her work opening up public access to police surveillance records. Jennifer has written influential white papers on biometric data collection in immigrant communities and law enforcement use of face recognition. She speaks frequently at legal and technical conferences as well as to the general public on technologies like location tracking, biometrics, algorithmic decisionmaking, and AI, and has testified on facial recognition before committees in the Senate and House of Representatives. She is regularly consulted as an expert on these subjects and others by major and technical news media.

This article was sourced from EFF.org
