As It Searches for Suspects, the FBI May Be Looking at You
Questions about accuracy and transparency plague the bureau’s five-year-old face matching system.
The FBI has access to nearly 412 million photos in its facial recognition system—perhaps including the one on your driver’s license. But according to a new government watchdog report, the bureau doesn’t know how error-prone the system is, or whether it enhances or hinders investigations.
Since 2011, the bureau has quietly been using this system to compare new images, such as those taken from surveillance cameras, against a large set of photos to look for a match. That set of existing images is not limited to the FBI’s own database, which includes some 30 million photos. The bureau also has access to face recognition systems used by law enforcement agencies in 16 different states, and it can tap into databases from the Department of State and the Department of Defense. And it is in negotiations with 18 other states to be able to search their databases, too.
The size of the total pool of photos the bureau can access, which was not clear until the new report from the Government Accountability Office, is shocking even to those who have been paying close attention to the FBI’s growing use of biometric data, says Jennifer Lynch, senior staff attorney at the Electronic Frontier Foundation. And the degree to which the FBI has access to photos in state-owned face image databases, which contain mostly driver’s license images, has Lynch and other privacy advocates concerned.
Deploying face recognition is the “logical next step” in the FBI’s use of biometrics, says Anil Jain, a professor of computer science and engineering and head of the biometrics research group at Michigan State University. The bureau already had an automatic fingerprint matching system, and adding face images to that data will lead to more reliable identification, he says. Surveillance cameras are everywhere, and facial recognition technology has improved to a point where “it makes sense to collect this additional data,” says Jain, who works closely with the Michigan state police.
Jain points out that the FBI’s fingerprint database also contains images taken for noncriminal purposes, like employee background checks. But photos of faces are different because they can be captured covertly, argues Alvaro Bedoya, executive director of the Center on Privacy and Technology at Georgetown Law. “I know what I touch, and I certainly know if I give fingerprints for a background check,” he says. “I don’t think there’s anyone who keeps track of every surveillance or smartphone camera.”
Adding to the privacy concerns is another finding in the GAO report: that the FBI has not properly determined how often its system makes errors and has not “taken steps to determine whether face recognition systems used by external partners, such as states and federal agencies, are sufficiently accurate” to support investigations. By taking those steps, the bureau “could better ensure the data received from external partners are sufficiently accurate and do not unnecessarily include photos of innocent people as investigative leads,” the report concludes.
According to the GAO, the FBI said it does not have the authority to audit the external systems. There is no evidence that the FBI’s face recognition system has ever mistakenly implicated someone in a crime.
State-of-the-art face recognition systems are generally very accurate, says Jain. But poor image quality—in either the images being tested or those in the database—can significantly hinder accuracy. If a lot of time has gone by since an image in the database was taken, that could also make errors more likely. And the larger the number of images in the database, the greater the chance of such errors—either incorrect matches or failure to match photos of people already in the database. This is why it’s so important that human biometrics experts make the final determination as to whether a match is correct, he says.
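Jain's point about database size can be made concrete with a back-of-the-envelope sketch. If each one-to-one comparison carries some small false-match rate f (the figure used below is purely illustrative, not a measured property of the FBI's system), then the chance that a probe image falsely matches at least one photo grows with the size of the gallery it is searched against:

```python
def false_match_probability(n_photos: int, f: float = 1e-9) -> float:
    """Probability of at least one false match when a probe image is
    compared independently against n_photos gallery images, each
    comparison having false-match rate f (an assumed, illustrative value)."""
    return 1 - (1 - f) ** n_photos

# Compare the FBI's own ~30 million photos with the ~412 million it can access.
for n in (30_000_000, 412_000_000):
    print(f"{n:>11,d} photos -> {false_match_probability(n):.3f}")
```

Under this toy model, expanding the searchable pool from 30 million to 412 million photos raises the chance of at least one spurious hit by more than a factor of ten, which is why Jain stresses human expert review of candidate matches.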