Land of a billion faces

Clearview AI probably has pictures of your face in its database. And its software—which police departments use—can almost certainly identify you.

August 12, 2020

Clearview AI has built one of the most comprehensive databases of people’s faces in the world. Your picture is probably in there (our host Jennifer Strong’s was). In part two of this four-part series on facial recognition, we meet the CEO of the controversial company, who tells us our future is filled with face ID—regardless of whether it’s regulated.

We meet: 

  • Hoan Ton-That, Clearview AI 
  • Alexa Daniels-Shpall, Police Executive Research Forum 

Credits:

This episode was reported and produced by Jennifer Strong, with Tate Ryan-Mosley and Emma Cillekens, with special thanks to Karen Hao and Benji Rosen. We’re edited by Michael Reilly and Gideon Lichfield. Our technical director is Jacob Gorski.

Full episode transcript:

Jennifer Strong: There’s a fight playing out in the courts between the networking site LinkedIn and a company called hiQ Labs—a startup that tells corporations when their employees are at risk of being poached by other companies… 

The problem is, it does this by pulling data off of LinkedIn’s website. 

News anchor reading: With more than 500 million users worldwide, LinkedIn is a treasure trove of personal information. But what if that information you don’t want to share is getting back to your boss? 

Jennifer Strong: But hiQ argues that’s OK—all this data is publicly available without a login. The case may go to the Supreme Court this year, though so far, the legal system agrees with hiQ.

Do you know what your rights to privacy are on websites like LinkedIn or YouTube? 

Would it surprise you that photos—including some you’ve never seen but that somehow wound up on the web—are being used by companies to grow their businesses, including to build things like AI systems that identify suspects for police?  

I’m Jennifer Strong and in part two of our series on face recognition and policing we speak with chief executive Hoan Ton-That, the founder of one of the world’s most controversial tech companies, Clearview AI.

Jennifer Strong: Back in 2011 Google's CEO at the time, Eric Schmidt, gave a keynote interview at a conference hosted by The Wall Street Journal. 

Eric Schmidt: I’m very concerned personally about the union of mobile tracking and face recognition. 

Jennifer Strong: Combining face ID with the tracking data off cell phones could give away almost every detail about how and where we spend our time. 

Schmidt said he believed it could be used for good or evil, but in democracies he thought it would be regulated quickly. 

He was on stage with journalists Walt Mossberg and Kara Swisher and they pressed him on it: What capabilities did Google have, and what might happen in the wrong hands?

Eric Schmidt: To, to be clear, we built that technology… and we withheld it. As far as I know it’s the only technology that Google built and after looking at it, we decided to stop.

Jennifer Strong: Fast forward nearly a decade. Facial recognition is still not regulated, and big tech is back to questioning what should be built, and who should have it.

Thing is, tech giants aren’t the biggest players in that space. Companies that are—including NEC, Cognitec, and Clearview AI—are continuing to sell their systems.  

And because facial recognition isn’t regulated, unless a company decides to tell us these tools exist or a journalist uncovers something, we won’t necessarily know what’s out there—let alone how it’s used, even when it’s used on us.

Clearview is sometimes referred to as the killer app of face ID. It’s also incredibly controversial. It quietly scraped billions of images off the internet—off Venmo, LinkedIn, Google, Facebook, Twitter, YouTube, and so on—and it built a system that’s believed to be widely used by law enforcement, including the FBI and ICE, as well as state and local police. 

There are many legal battles being fought over this practice, called web scraping—not because there’s something inherently bad about it; it’s just a tool for gathering data from the internet. It’s how websites offer price comparisons and real estate listings. It’s also how a great deal of public research gets done. 
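To make that concrete, here’s a minimal sketch of what scraping can look like in practice, under stated assumptions: the page URL below is hypothetical, and a real scraper would also honor robots.txt, rate limits, and a site’s terms of service. The script simply fetches one public page and lists the image URLs it references.

```python
# A minimal web-scraping sketch: download one public page and list
# the image URLs it embeds. The target URL is a hypothetical example.
import requests
from bs4 import BeautifulSoup

def collect_image_urls(page_url: str) -> list[str]:
    """Fetch a public page and return the image URLs it references."""
    response = requests.get(page_url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    # Every <img> tag with a src attribute is a candidate image.
    return [img["src"] for img in soup.find_all("img", src=True)]

if __name__ == "__main__":
    for url in collect_image_urls("https://example.com/public-profile"):
        print(url)
```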

So, the real issue is there aren’t really ground rules for what can be scraped. And the federal law most often used to sort out these cases? Well, it’s from 1986. Before the Web even existed.

Throughout this series we’re going to hear from folks who build technologies, as well as those who fight against them. And I sat down with Clearview’s chief executive to talk about all of this, from his perspective. 

Hoan Ton-That: My name is Hoan Ton-That and I'm the founder and CEO of Clearview AI.

Jennifer Strong: Ok, how would you describe your company, the technology, and what it does?

Hoan Ton-That: Basically it is a search engine for faces. So you upload a photo of a face and it finds publicly available links that are online. And right now it's used for law enforcement to solve crimes after the fact. So an officer, if they're stuck on a case and they have something from video footage, they can run it through our system and then start an investigation. 

Jennifer Strong: He says it immediately appealed to law enforcement, but that was after he shopped it around to a few different groups.

Hoan Ton-That: When we were building our facial recognition technology we explored many different ideas and many different sectors from private security to hospitality ... when we gave it to some people in law enforcement, the uptick was huge. And they called us back the next day and said, we're solving cases. This is crazy. In a week's time we had a really thick booklet. 

Jennifer Strong: So where did you get the idea to create Clearview? What were your motivations?

Hoan Ton-That: I've always loved to learn about computer programming since I was a kid looking at the MIT video lectures or doing open source projects and downloading images to train better models for facial recognition. And eventually that morphed into doing a facial search engine. And it was just a surprise to me how many people really didn't tackle this idea cause it's such a hard problem because you have to be very, very accurate but we stuck at it and it ended up working really well.

Jennifer Strong: There's a growing list of reasons why researchers might choose not to work on a search engine of faces. A big one is how that work might be applied.  

Kade Crockford: Take the case of Steve Talley, a financial analyst from Colorado. 

Jennifer Strong: Kade Crockford is a privacy advocate. This is from her Ted Talk. 

Kade Crockford: In 2015, Talley was charged with bank robbery on the basis of an error in a facial recognition system. Talley fought that case and he eventually was cleared of those charges, but he lost his house, his job and his kids. Steve Talley's case is an example of what can happen when the technology fails. But face surveillance is just as dangerous when it works as advertised. 

Just consider how trivial it would be for a government agency to put a surveillance camera outside a building where people meet for Alcoholics Anonymous meetings. It would be just as easy to use this technology to automatically identify every person who attended the Women's March or a Black Lives Matter protest. 

Jennifer Strong: Ton-That, though, believes Clearview’s tool is safer than what she’s describing.  

Hoan Ton-That: A false positive in a live setting is more of an issue than it is in an after the fact setting. Because if you're getting an alert and you're running down to find the person, you have maybe a lot less time to see if it's correct. Whereas if you're behind a desk doing an investigation, you have all the time in the world to make sure you're doing the right thing.  

Jennifer Strong: But there’s no agreement on what “doing the right thing” means.

In episode one we met Robert Williams. He was wrongly arrested in just the type of investigation Ton-That is talking about, after software incorrectly matched his driver's license photo to pictures of someone stealing watches. 

The tool used in the case of Mr. Williams wasn’t built by Clearview but by a company called DataWorks—though both of these systems rely on neural networks.

Hoan Ton-That: So a neural network is a newer form of artificial intelligence where, instead of hard coding certain factors (for example, when we want to do facial recognition to find similar faces of the same person, instead of hard coding factors like the distance between your eyes or the distance between the eyes and the nose), it just learns from a ton of different examples. And what we do is we collect like a thousand examples of George Clooney or a thousand examples of Brad Pitt, and the machine over time learns the difference between those two faces, and then it can apply it to a face that hasn’t been seen before.
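For readers who want the mechanics, here’s a rough sketch of the learned-matching idea Ton-That describes. This is not Clearview’s system: the embed function is a deterministic placeholder for a trained network, and the 0.6 threshold is illustrative. The point is that the network maps each face photo to a vector, and two photos count as the same person when their vectors are close.

```python
# A sketch of learned face matching: a network turns each face photo
# into a fixed-length vector (an "embedding"), and two photos match
# when their vectors are close. embed() is a stand-in, not a real model.
import numpy as np

def embed(photo: np.ndarray) -> np.ndarray:
    """Placeholder for a trained neural network: returns a fake but
    deterministic 128-dimensional embedding for a face crop."""
    rng = np.random.default_rng(int(photo.sum()) % (2**32))
    return rng.standard_normal(128)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_person(photo_a: np.ndarray, photo_b: np.ndarray,
                threshold: float = 0.6) -> bool:
    # The threshold is tuned on labeled pairs of example photos,
    # not hard-coded rules about eye or nose distances.
    return cosine_similarity(embed(photo_a), embed(photo_b)) >= threshold
```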

Jennifer Strong: Something everyone we spoke to for this series agreed on is that these systems work best when the lighting is good and cameras are placed at face height. 

But with security cameras, that’s rare. There’s also the challenge of scale.

Hoan Ton-That: How do you search billions of faces or vectors in under a second? Typical databases look up by name and email. This looks up by similarity. And doing that at scale is hard. We had to get our own data center as well for that. Typically, if you're buying a facial recognition system, there's a cold start problem. What photos do you put in there? So police departments might have their own mugshots, but they don't have mugshots from other police departments. So it really limits the usefulness of it. And we just realized there's trillions and trillions of web pages on the internet and on social media and, you know, news sites, mugshot websites.
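As a toy illustration of that "lookup by similarity," here’s a brute-force nearest-neighbor search over a small synthetic gallery. A system at the scale Ton-That describes would shard the vectors across a data center and use an approximate nearest-neighbor index rather than scanning everything; the gallery size and the random vectors here are assumptions for the sketch.

```python
# Toy similarity search: given a query face embedding, find the most
# similar vectors in a gallery by scanning it. Real systems use
# approximate indexes sharded across machines instead of a full scan.
import numpy as np

rng = np.random.default_rng(0)

# Pretend gallery of 100,000 unit-length 128-d face embeddings
# (billions, in the scenario described above).
gallery = rng.standard_normal((100_000, 128)).astype(np.float32)
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

def top_matches(query: np.ndarray, k: int = 5) -> np.ndarray:
    """Return the indices of the k gallery vectors closest to the query."""
    query = query / np.linalg.norm(query)
    scores = gallery @ query  # cosine similarity via dot products
    return np.argsort(scores)[-k:][::-1]

print(top_matches(rng.standard_normal(128).astype(np.float32)))
```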

Jennifer Strong: We’re just a few miles away from each other in New York City, but because of the pandemic we’re chatting over Zoom.

Hoan Ton-That: Jennifer, I had took a screenshot of a photo of you before. Do you mind if I upload it? 

Jennifer Strong: No, that’s fine. 

Jennifer Strong: And he puts an old photo from my LinkedIn account up on the screen.

Hoan Ton-That: So, on the left side? You see this new search button where you pick a photo? So this is the one I'm going to use. Don't worry. No one can see the screen except for us (laughter).

And so that took, you know, about a second. Uh, and you can see there's a link that you can click on. But as we go through this is, so that's, uh, from Twitter. Do you remember this photo at all?

Jennifer Strong: Ummm no I didn’t know that was taken... I look very...

Hoan Ton-That: Yeah. You do. You look very serious in that one. Yeah.

Here you're giving a talk at The Wall Street Journal, the Future of Everything.

Jennifer Strong: Yeah. 

Hoan Ton-That: You're interviewing someone here at Duke Health. So, like I said, all these things are publicly available. So, that's basically how it works.  

Jennifer Strong: There’s nothing unusual here. Just pictures of me at work reporting stories and hosting panels in different cities —though it is kind of jarring to find photos of myself I’ve never seen—and once again it brings up this thorny question of consent. 

You see, I'm unlikely to check a box giving permission to companies like Clearview to scrape my image and use it to build their businesses.

The thing is, they don't need it.

We’ll be back in a moment right after this.

Hoan Ton-That: What's unique about Clearview AI, and what makes it a little harder for people to understand, is it's only searching publicly available information.  

Jennifer Strong: And this is where Ton-That makes an argument we may be debating and litigating for many years to come: that the open internet as we know it, including things like Google searches, wouldn’t really exist had we put restrictions on the use of online data.

Hoan Ton-That: LinkedIn, a billion dollar company, or Microsoft, a trillion dollar company: they don't have the right to block other people's access to public data. So it's a thing that is just kind of in an interesting spot, because it is only searching publicly available information. And things that people want to be private, we do know that we don't want to plaster all over the internet. So I think that we have an instinct of what we want to keep private and what we want to keep public, and that will always be the case.

Jennifer Strong: It’s safe to say not everyone agrees with him. 

Twitter is among a whole host of companies that sent Clearview cease-and-desist letters, telling it to stop scraping their images and to delete all of their data. Twitter also says its policies prohibit its data from being used for facial recognition. Given he’s already scraped billions of images, you’d be forgiven for wondering just how much more information is still out there for him to capture.

But what he’s gathered so far is only the tip of the iceberg. 

Hoan Ton-That: We're still not even 1% of what's out there. When you run the numbers, it's kind of crazy how much information is out there. So when it comes to privacy, we kind of have to look at ourselves and say, well, we are voluntarily sharing a lot of this information. And that may be true, but how, how do we feel about it? We don't have any private information like Google or Facebook does. Google has your location on Android all the time. Facebook knows all your habits and what you like and what you don't like. The Instagram explore tab is just like phenomenal at finding out what you like, it's kind of scary, but they kind of know a lot more information than what we do. And we're just focused on trying to apply it for the greatest good, we think, to make the world a lot safer. 

Jennifer Strong: And to him that means working with police.

Hoan Ton-That: So already in a lot of law enforcement agencies that use facial recognition, they have a procedure saying you cannot just arrest someone based on a facial recognition match. You still have to do follow-up research. So there's always a human in the loop that checks: is this person the right person? Do they have the right name? Does that person live in the same area where the crime was committed?

Jennifer Strong: But what about folks who are falsely accused? 

Ton-That would argue that’s a human failing. In the same way we’re still responsible for how we drive while using GPS—when the navigation says “turn right” and it’s not safe to do so—it’s up to us and our human brains to ignore it.

And he reminds us that people get it wrong too.

Hoan Ton-That: Like one example is human lineups. The Innocence Project says 70% of wrongful convictions are from eyewitness testimony. So, you know, if you're a bad police officer and you want to frame someone for a crime they didn't do, you can kind of like edge people into picking the person you want out of a lineup, and I think technology like Clearview can actually help add a lot of transparency and accountability. Something that we'd love to, to really take a strong look at is how it can actually help people not be misidentified.  

Jennifer Strong: In a way, this argument he’s making against human judgment? It's the same one that’s used against handing these decisions off to AI: there’s error, bias, and racism. But where Ton-That doesn’t believe we can reliably change people, he does believe he’s removed these things from his system. 

Hoan Ton-That: And so we believe we've totally solved the accuracy problem and the racial bias problem that have kind of plagued other facial recognition companies. And we want other people to know that we can really take this technology and use it. 

Jennifer Strong: These are very big claims about something that may not even be possible, and there’s currently no way to verify any of it. Clearview hasn’t provided the kind of public access that would allow its system to be audited in the same way that Amazon and others have. 

He says the company did its own audit, assembling an independent review panel that took an approach similar to the ACLU’s test of Amazon’s facial recognition system, which ran photos of members of the US Congress against a database of mugshots: photos of people who’ve been arrested for a crime.

Hoan Ton-That: And so they ran this independent study. But instead of searching a gallery of 25,000 mugshots it searched a gallery of 2.8 billion photos at the time. And we did other state legislatures like New York state and Texas. And each of the results that came up were the correct person and they went through them individually. 

Jennifer Strong: From Clearview’s perspective, this means the technology might actually help the justice system become more fair. He says they got Jonathan Lippman, the former Chief Judge of the New York Court of Appeals, to be part of that review panel. 

Hoan Ton-That: And he really believes that if you have something that's more accurate, it's better for the defendants as well. They're not going to go to jail for a crime they didn't commit.

Jennifer Strong: If Silicon Valley has a brand, it’s this techno-optimism about how their creations will change the world, but without the burden of being responsible for any unwanted changes that might go along with that.

Perhaps it really shouldn’t be the job of tech creators to worry about what kinds of transparency, oversight, and guardrails are needed to protect the public.  

Hoan Ton-That: I think it's the responsibility of government and policy makers to come up with regulations and tech companies should have a seat at the table, and it's in their interest to have a seat at the table. Sometimes you see bad policies passed because they don't know how the technology works. So, I think more tech companies are going to engage with policy.

We've had a lot of attention but we know we're doing the right thing. And I think in the long run, any kind of new technology is controversial from the printing press... and that's just part of the process. The choice is not between no facial recognition and facial recognition. It's between responsible facial recognition and a kind of a wild west. 

Jennifer Strong: One group aiming to help tame that wild west is The Police Executive Research Forum. The nonprofit has spent the last four decades helping police chiefs work through emerging issues.

Alexa Daniels-Shpall: The use of tasers. The use of body-worn cameras. Now we're also looking at the topic of facial recognition.

Jennifer Strong: Alexa Daniels-Shpall is leading this research in partnership with the US Department of Justice. 

Alexa Daniels-Shpall: We’ve been doing a lot of research in this area with the goal of developing some national guidelines. They’re using it in a variety of different ways, and they’ve sorta all developed their own procedures, protocols, and policies.

Jennifer Strong: We’ll get into that next episode… But for now, the important thing is that her research suggests Clearview’s adoption by police departments may not be as widespread as claimed. In January, Ton-That told the New York Times that more than 600 law enforcement agencies had started using his product in the past year.

Alexa Daniels-Shpall: We’ve only seen a handful that have followed through with a formal contract.

Jennifer Strong: Bottom line: police agencies are using Clearview. But there’s a big difference between trying and buying.

Alexa Daniels-Shpall: I know that some tried it out and then decided not to use it. And, at least some of them we spoke with said it just didn't work that well for them. And I think it probably depends on, you know, what you're testing it with and, and just where you are in the country. Cause, you know, I don't know that anyone has a sense of where like how many images are coming up in different areas. 

Jennifer Strong: And not all policing agencies that tried it out did so knowingly… or through official channels.

Alexa Daniels-Shpall: The executives found out that detectives had been approached by the company to test it out. And they then brought it to their bosses to say, hey, we should look into moving forward with it. And the executives sort of said, we're going to shut that down for now, and go through our normal procurement and evaluation processes before we move forward. … It's been sort of a mixed reception, I would say, from different agencies. And the more important question would be to find out how many sort of permanent long-term contracts, and how many agencies have done the formal procurement to make a relationship with the company, rather than just those free trials that were going around.

Jennifer Strong: Next time, we’ll meet police around the US who are using face ID… 

Dori Koren: It is a little bit Hollywoodish, but we did that on purpose; that's how we wanted it to feel. So, imagine walking into a large room, and on the front wall you have this massive display: all kinds of camera feeds, like a surveillance room, but a little bit more high tech, a little bit bigger, a little bit more advanced. 

Jennifer Strong: And find out what role actor Woody Harrelson and other celebrities unwittingly play in naming police suspects...

This episode was reported and produced by me, with Tate Ryan-Mosley and Emma Cillekens, with special thanks to Karen Hao and Benji Rosen. We’re edited by Michael Reilly and Gideon Lichfield. Our technical director is Jacob Gorski. 

Thanks for listening, I’m Jennifer Strong. 
