
What happens in Vegas… is captured on camera

Police departments around the US use face recognition in highly varied ways, with little consensus on best practices.

August 12, 2020

The use of facial recognition by police has come under a lot of scrutiny. In part three of our four-part series on facial recognition, host Jennifer Strong takes you to Sin City, which actually has one of America’s most buttoned-up policies on when cops can capture your likeness. She also finds out why celebrities like Woody Harrelson are playing a starring role in conversations about this technology. 

We meet: 

  • Albert Fox Cahn, founder of the Surveillance Technology Oversight Project (S.T.O.P.)
  • Phil Mayor, ACLU Michigan senior staff attorney 
  • Captain Dori Koren, Las Vegas Police 
  • Assistant Chief Armando Aguilar, Miami Police 

Credits: 

This episode was reported and produced by Jennifer Strong, Tate Ryan-Mosley and Emma Cillekens. We had help from Benji Rosen and Karen Hao. We’re edited by Michael Reilly and Gideon Lichfield. Our technical director is Jacob Gorski.

Full episode transcript:

Jennifer Strong: Alright, so we're out taking a drive on one of New York City's busiest bridges. It's called the RFK. And the reason we're doing this is because the transit authority, the MTA, has installed live facial recognition. Actually, we're about to go under it right now. Do you see that the camera's pointed right at our faces? They've now put this all over the city, on a number of bridges and tunnels, but the reason we're on this one is because this is where it all started. And what it does, at least in theory, what it's supposed to do, is read our faces through our windshields. And what's crazy about this is that nobody knew there was a test, and also that the results of that test were leaked to the press last year. You want to guess how many faces were captured during the test period? What do you think? This is Emma Cillekens, by the way... my producer.

Emma Cillekens: I don't know like thousands, maybe, maybe millions? This is New York City, maybe millions.

Jennifer Strong: No. Um, they got none. Zero percent. But they moved forward with it anyway.

Andrew Cuomo: The cashless tolling, which is really not about tolling. It's really about putting in an electronic system that we have never had before.

Jennifer Strong: That's New York Governor Andrew Cuomo speaking at an event in 2018.

Andrew Cuomo: What it is doing is it's put in a system that reads every license plate that comes through that crossing and reports to the police car that is stationed right there within five seconds.

Jennifer Strong: But this is not some tech event, nor is it about policing. It's celebrating the end of repairs to a tunnel connecting Brooklyn and Manhattan. You see, the city's subways and tunnels flooded with saltwater in the aftermath of Hurricane Sandy. All the electronics and wiring had to be replaced, and that gave them an opportunity to try something new.

Andrew Cuomo: We're now moving to facial recognition technology, which takes you to a whole new level where it can see the face of the person in the car and run that against databases.

Jennifer Strong: And they're experimenting with something even more cutting edge than reading faces or license plates. They're attempting to read people's ears.

Andrew Cuomo: Because many times a person will turn their head when they see a security camera. So they're now experimenting with technology that just identifies a person by the ear, believe it or not.

Jennifer Strong: I'm Jennifer Strong and this is part three of our series on police and facial recognition. This episode, we take a closer look at how the technology is being used in different cities and meet some police chiefs helping to make those decisions. 

[Show Introduction]

Jennifer Strong: Roads are not the only place New York City's transit authority may be experimenting with face ID, but without rules for how it gets used, we tend to find out about it after the fact, and often by accident, like when a New York Times reporter called out on Twitter something she'd seen in the subway.

Albert Fox Cahn: So, this was a monitor set up in the Times Square subway station that had these yellow boxes around people's faces, indicating that it was using, at a minimum, face detection technology, if not facial recognition.

Jennifer Strong: Albert Fox Cahn is the founder of the Surveillance Technology Oversight Project. Otherwise known as STOP. He reached out to the transit agency, the MTA, after seeing that tweet, but he says they didn't give much of a response.

Albert Fox Cahn: So, we sued them. And recently we got a favorable decision from a New York state judge, who said that the MTA wrongly withheld information about those monitors without providing us any explanation of how they were being used, whether there was facial recognition involved, and what the purpose of setting them up in the first place was. Because the MTA's justification for using these monitors was, "Don't worry, these aren't facial recognition. We just wanted to scare the public into thinking we had facial recognition so that they wouldn't skip their fare."

Jennifer Strong: But scare tactics aren't the only ones he's concerned about. He's been arguing with the NYPD about how they use this technology for a number of years.

Albert Fox Cahn: You have this incredibly powerful technology, but it's already prone to certain types of errors and to certain types of bias. And the NYPD is going through and using it in a way that only makes that risk more pronounced. You have them feeding in doppelgänger photos, and we have no idea how many times this has been done.

Jennifer Strong: There's even a name for this: the celebrity comparison. A few years ago, investigators were searching for a suspect caught on tape stealing beer from a drugstore in New York City. The camera captured his face, but not well enough for the facial recognition system to return any matches. A detective noticed the suspect looked a little bit like the actor Woody Harrelson, and after running pictures of Harrelson through the system, they found a match. We don't know how often this happens, but we know it's more than once. In another example from the NYPD, a New York Knicks player stands in for the suspect.

Albert Fox Cahn: And so using a celebrity's photo, to me, that's very problematic, that you're appropriating someone's image for this sort of purpose. Also, there's no evidence that this is accurate.

Jennifer Strong: Another practice he finds troubling: police manually Photoshopping images. You see, face recognition often fails to recognize faces with closed eyes, so some police departments will paste open eyes onto those photos to get results. They also alter photos in other ways.

Albert Fox Cahn: If the mouth is open, you'll Photoshop it closed. If part of the face is obscured, you'll see cases where they go on Google Images and copy and paste part of another image to try to create a composite that the facial recognition algorithm will identify as a viable human face. Because even though these tools can be incredibly powerful, they are also so fragile, and even having one eye closed can be enough that a facial recognition algorithm doesn't actually see a human face it can match.

Jennifer Strong: What's more, we don't really even know what tools police have in some places, including New York.

Albert Fox Cahn: But even our elected officials, our city council don't know what tools they're using. And it's a threat to democracy itself when you have the police operating without any oversight, but it also makes it impossible to have the sort of public pushback we need to start banning these tools when we don't know that they're being used in the first place.

Jennifer Strong: But this is changing. This summer, the city passed the Public Oversight of Surveillance Technology Act, or the POST Act. It requires police to disclose basic information about what surveillance technologies they have, how they work, how they're used, and how often. It covers tech including cell phone location trackers, automated license plate readers, body cameras, social media monitoring software, and of course, facial recognition. But it's not a new bill. It's been sitting around for the last three years.

Albert Fox Cahn: None of the sweeping reforms we've seen in New York would have been possible without the incredible protests and uprisings we've seen around the city, around the country. It's fundamentally changed the debate over policing in this city.

Jennifer Strong: The ACLU is another player working on the same puzzle. It's a nonprofit focused on individual rights and liberties. They're outspoken about the need for regulation of face ID, especially in the case of Robert Williams, the man we met in episode one.

Robert Williams: I didn't know that they used any type of facial recognition or anything like that until talking with the detectives, who showed me that that's what they used to apprehend me.

Melissa Williams: We didn't think this could happen. We didn't think it was a thing. Even with me following the facial recognition news and how it was being used, I didn't ever expect the police to show up at our doorstep and arrest my husband. So I just feel like other people should know that it can happen, and it did happen. And it shouldn't happen.

Jennifer Strong: He and his wife, Melissa, who you just heard, are represented by Phil Mayor, a senior staff attorney at the ACLU of Michigan.

Phil Mayor: Mr. and Mrs. Williams' defense attorney talks about how this is the first time in her years of representing criminal defendants that she's actually learned that her client was identified through facial recognition technology. And again, that didn't come out in court. It came out because the police accidentally said something. So I just am as confident as I can be that there are people who are in jail today, convicted of crimes they took plea bargains to, all because a computer made the same kind of mistake it did in Mr. Williams's case.

Jennifer Strong: But real harm can still be done by a false match going public, even when nobody gets arrested.

Amara Majeed: On the morning of April 25th, in the midst of finals season, I woke up in my dorm room to 35 missed calls, all frantically informing me that I had been falsely identified as one of the terrorists involved in the recent Easter attacks on my beloved motherland, Sri Lanka.

Jennifer Strong: Amara Majeed was 22 years old and a senior at Brown University when her face appeared on a poster with the name of a different woman accused of these attacks. She'd been falsely identified by an algorithm.

Amara Majeed: There are no words to describe the pain of being associated with such heinous attacks on my own native homeland and people. The pictures and posts falsely implicating me have compromised my family's peace of mind and endangered our extended families' lives.

Jennifer Strong: Sri Lankan authorities later apologized for the mistake, but not before she was harassed, trolled, and threatened on social media. It all boils down to this: policy, transparency, and oversight of facial ID differ radically from one place to the next. And even when there are rules in place to prevent these types of things from happening, they're only as good as how well they're followed. When we come back, we'll meet some police departments working on that.

[Ad]

Jennifer Strong: The NYPD released its face ID policy back in March, after nearly a decade of public pressure, but other police departments have been much more willing to engage the public on this from the start, including in the land of casinos, Las Vegas, where visitors and locals alike are no strangers to being surveilled, because of the huge sums of money that pass through those gaming room floors. Police Captain Dori Koren is the commanding officer who oversees the Las Vegas Strip, and he tells me about a high-tech surveillance room that sounds like it could be featured on one of those cop shows. Because of coronavirus, I can't visit the command center in person, so he describes it for me.

Dori Koren: It is a little bit Hollywoodish, and we did that on purpose in terms of how we wanted it to feel. So imagine walking into a large room, and on the front wall you have this massive display, all kinds of camera feeds, like a surveillance room, but a little bit more high tech, a little bit bigger, a little bit more advanced.

Jennifer Strong: Everything in this room is connected. Alerts play when audio sensors detect a shooting, plus lots of other things officers might be interested in knowing about, and all of these identity technologies show up on a big gridded map, so people and events can be placed at a specific location in near real time. And it kind of speaks to the city as a whole. Las Vegas is one of the country's pioneering smart cities, with sensors embedded just about everywhere, all the way down to trash cans that can smell what's inside. And Captain Koren says face ID has revolutionized policing. Instead of just responding to crimes, he says they can get in front of them.

Dori Koren: Facial recognition for us has proven absolutely instrumental, instrumental in saving lives, instrumental in preventing violent crime.

Jennifer Strong: He says it allows officers to identify a pattern of criminal activity in real time.

Dori Koren: Figure out that there's a second robbery that just happened at a convenience store and perhaps the description of the suspect who committed the robbery with a firearm has the same color shirt as the robbery that just happened 10 minutes ago. Or the same vehicle. Or the same behavior. And they try to pick up on these patterns and then they send that out in real time so that way it could prevent the third robbery or fourth robbery.

Jennifer Strong: And they're happy to talk about it.

Dori Koren: We are firm believers in being transparent. I mean, at the end of the day, police serve the community that they come from. So we want to make sure the innovative and advanced things that we are doing, particularly when it comes to deploying technologies for fighting crime, are accepted by the community.

Jennifer Strong: This kind of openness is rare, and not just with law enforcement. Private companies use facial recognition too, often without telling anyone. A recent example: face ID cameras were quietly installed at hundreds of Rite Aid pharmacies, largely in lower-income and nonwhite parts of Los Angeles and New York, according to Reuters. And casinos have been using some form of facial identification for decades, for fraud prevention, enhanced security, and even to recognize gamblers at the tables.

Dori Koren: The private sector has some of the more advanced on-edge facial recognition platforms. These are platforms that are recording and analyzing people's faces in real time as they cross the camera view. You don't want to leave just the private companies, the general public, which also will include all your criminals, to have the best and most advanced technologies, and to leave law enforcement with archaic tools that aren't able to do the job in the 21st century.

Jennifer Strong: But that doesn't mean he thinks police should have free rein with all types of the technology.

Dori Koren: We do not use facial recognition on edge, which basically means live recording of the footage on that camera to scan everybody's face that comes across that camera angle. We don't do that. I don't know if law enforcement's ready to use that and certainly I don't think that law enforcement should, unless their community supports the use of that type of very advanced technology.

Jennifer Strong: They also don't search social media photos for suspects. They only run searches on someone suspected of committing a crime, and if they get a match, they limit its value, meaning you can't just go arrest that person based on that match. But learning all of this meant I was in for a surprise when I asked: do you ever alter the photos so that they work better?

Dori Koren: That aspect of altering the photos has gotten so much negative attention as if the police are doing something wrong. And I don't think that the public generally understands why you would alter a photo. So the answer to your question is, yes, we do alter photos as part of that facial recognition exam. If anything, we should be arguing for that, not against it.

Jennifer Strong: He says, in some cases, if they didn't do that, their accuracy would actually go down.

Dori Koren: By changing the photo, by making it a little lighter, a little darker or changing the angle, it gives us the best chance to be able to confirm the results, to be able to test the algorithm. We don't want to go and pursue the wrong lead of the wrong individual. And so whatever we have to do to modify the photo, to be able to get the most accurate results and like I said, for us, we use a variety of other checks and balances to then determine that that is truly the likely candidate.

Jennifer Strong: Koren did go on to say they don't merge other people's faces with the input photo, so no pasting on open eyes, for example. But it raises the question: how exactly are we measuring accuracy? And who gets to decide when and how this practice of modifying photos is fair play?

Dori Koren: Safety is not just physical safety; it's also doing so with a balance to privacy, civil rights, and civil liberties. And I think that there's a right balance. There's a way to do that. But the conversation has to be open, and people have to be open on both sides to have that conversation. So the future for policing can become much better with these technologies, as long as we do it right.
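To make the exchange above concrete, here is a minimal sketch of the kind of preprocessing Koren describes: lightening or darkening a probe photo before handing it to an off-the-shelf matcher, and watching how the result changes. The tooling (OpenCV plus the open-source face_recognition package), the file names, and the brightness values are illustrative assumptions, not any department's actual pipeline.

    # Hypothetical illustration only; not any police department's actual tooling.
    # Requires: pip install opencv-python face_recognition
    import cv2
    import face_recognition

    def match_distance(probe_path, candidate_path, beta=0):
        """Compare a brightness-adjusted probe photo to a candidate photo.

        beta shifts brightness (0 leaves the probe unaltered). Returns the
        face distance (lower means more similar), or None if either image
        yields no detectable face.
        """
        probe = cv2.imread(probe_path)                             # loads as BGR
        probe = cv2.convertScaleAbs(probe, alpha=1.0, beta=beta)   # lighten/darken
        probe = cv2.cvtColor(probe, cv2.COLOR_BGR2RGB)             # matcher expects RGB

        candidate = face_recognition.load_image_file(candidate_path)

        probe_enc = face_recognition.face_encodings(probe)
        candidate_enc = face_recognition.face_encodings(candidate)
        if not probe_enc or not candidate_enc:
            return None  # the algorithm "sees" no usable face, as Cahn described

        return face_recognition.face_distance([candidate_enc[0]], probe_enc[0])[0]

    # "surveillance_still.jpg" and "mugshot.jpg" are made-up file names.
    for beta in (0, 40, 80):  # unaltered, lighter, lighter still
        print(beta, match_distance("surveillance_still.jpg", "mugshot.jpg", beta=beta))

Run on a dim surveillance still, the unaltered probe can return no face at all, while a lightened copy produces a distance below whatever threshold an agency treats as a match, which is why the question of how accuracy is measured depends so heavily on how the probe photo was prepared.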

Jennifer Strong: It cannot be overstated just how much things like police department policies, tools, community relationships, and public oversight vary from place to place. So this balance of civil liberties and policing has to be struck, not just in Las Vegas, but in every town in the country. Still, we spoke to other departments and there are some common themes. Armando Aguilar is Miami's assistant chief of police and he oversees the criminal investigations division.

Armando Aguilar: This is an invaluable tool for law enforcement, but it's certainly very dangerous. And if you think about it, many of the other tools that law enforcement officers have in the wrong hands are also very dangerous. And so we wanted to make sure that we not only use the technology, but that we use the technology responsibly and that we did so in a way that the people that we served were comfortable with.

Jennifer Strong: They've been using facial recognition in some form since 2013.

Armando Aguilar: We started out and continue to use a program called FACES, which is operated by the Pinellas County Sheriff's Office. It's one of the counties in Florida and it's a shared database with all the counties throughout the state. And so we moved in late 2019 to a product called Clearview AI.

Jennifer Strong: FACES runs off a government database of photos, like the system in Las Vegas does, and Clearview runs off billions of photos scraped from the web.

Armando Aguilar: Technology moves faster than policy. There was a time when the technology was in use here without a guiding policy. And so once I stepped into my current role, I thought it was very important to set those parameters and just make sure that we were responsible in our use of the technology.

Jennifer Strong: Coronavirus threw them a curveball, and like everything else there, town halls went virtual, such as this one on Facebook Live.

Town Hall Voice: What data does Miami PD send to, or get tracked by, Clearview?

Jennifer Strong: In crafting their policy, Aguilar says he also met with the ACLU.

Armando Aguilar: They respectfully started the meeting by saying that there's nothing we could do that would satisfy them other than saying we're going to scrap the program altogether. But knowing that that likely wouldn't happen, they had about seven or eight very valid concerns that they brought to our attention. And we actually agreed with them, and we incorporated each one of those into our policy.

Jennifer Strong: They also put limits on who can use it.

Armando Aguilar: So what we also did, in order to make sure that we would limit the opportunities for abuse, was limit the number of users. So anybody that needs to have somebody's picture run through Clearview or through FACES needs to send a request to our real-time crime center.

Jennifer Strong: He made sure that all trial Clearview accounts offered to police in Miami were canceled. And he shrunk the number of people who have access.

Armando Aguilar: It's a lot easier to control when you have a dozen people working in the same unit versus people spread out throughout the entire 1,800-member police department using the program.

Jennifer Strong: With these new guidelines, even if a photo search finds a match, that doesn't give probable cause to go make an arrest.

Armando Aguilar: So now we have this match, but can we put that person in a photographic lineup and also have an eyewitness identify them? Is there fingerprint evidence? Is there DNA evidence? Can the investigating detective or officer make an identification himself after examining the video and the photograph that was uploaded into the system? So it's not just automatically, "Hey, this person came up as a 99% match. Let's go get them." Just because somebody calls in a tip and says, "Hey, I think that's my neighbor Bobby," we don't go and arrest Bobby.

Jennifer Strong: And once again, I wanted to know if they ever manipulate their photos before they run them through the software to up the odds of getting a match.

Armando Aguilar: No, we do not in any way manipulate our probe photograph, which is, again, the photograph that we upload into the system. We use whatever's available, and we either come up with a match or we don't.

Jennifer Strong: They also don't sub in celebrities to help find a suspect, nor do they monitor people in real time. And they've also thought about how face ID might infringe on constitutional rights.

Armando Aguilar: Maybe one day we have protests outside of the police station, which became very real in the last few months. We don't want to use this technology to just go identifying protesters or protest organizers. People want to live life out loud and post every waking moment of their life on social media, and also ask for privacy. And so certainly many of us do set our personal social media accounts to private settings. And so I do want to make sure that everybody is aware that this system in no way breaks into your private settings. It only searches the internet and social media pages for those images that are available for anyone to see. And so anything beyond that would be a clear violation of people's Fourth Amendment rights. And that's something that we neither have the capability of doing nor the interest in doing.

Jennifer Strong: So, what he's saying is we're responsible for protecting our own privacy on the web. And we do have the ability to turn Facebook settings to private, but using default settings on social media isn't quite the same thing as walking into a police department and providing a bunch of photos that give away lots of personal information. And this is part of what makes consent online really thorny.

Jennifer Strong: Next episode, we meet the founder of the Russian company behind what may be the world's largest real-time face ID system, which aims to not only work on people wearing masks, but also tell if they're worn properly and read your car's license plate at the same time.

Jennifer Strong: Do you ever worry that somebody might take all of your hard work and use it to build a world you don't really want to live in? Join us as we wrap up this mini series by exploring the way forward and examining what policy might look like.

Jennifer Strong: This episode was reported and produced by me, Tate Ryan-Mosley, and Emma Cillekens. We had help from Benji Rosen and Karen Hao. We're edited by Michael Reilly and Gideon Lichfield. Our technical director is Jacob Gorski. Thanks for listening. I'm Jennifer Strong.
