
AI was supposed to make police bodycams better. What happened?

New AI programs that analyze bodycam recordings promise more transparency but are doing little to change culture.


On July 25 last year, in circuit court in Dane County, Wisconsin, a motion was filed to dismiss a criminal case as a result of what defense attorneys described as “institutional bad-faith actions” by a local police department. The evidence was unearthed, in part, because of artificial intelligence. 

Attorney Jessa Nicholson Goetz had been preparing to defend her client against a sexual assault charge that arose from a 2021 Tinder date. During pretrial motions, Nicholson Goetz’s co-counsel noticed discrepancies around how the lead police investigator was discussing and documenting his use of a body camera, which department policy required him to wear at all times. Nicholson Goetz asked to review footage connected to the investigation; she said the police department delivered 40 hours of video before the trial began.

This kind of data dump is commonplace, typically arriving right before a trial begins. Manually reviewing body camera footage isn’t always a useful source of insight into a case; more often, especially for defense attorneys with limited resources, it’s a nightmare. Actually watching the tapes is enormously time consuming, and paying to have them transcribed can add tens of thousands of dollars to tight budgets. 

But this time, Nicholson Goetz and her team were using JusticeText, an AI-powered evidence management program that two former University of Chicago computer science students, Devshi Mehrotra and Leslie Jones-Dove, developed after being outraged by the police killing of Laquan McDonald in their city in 2014. JusticeText analyzes the audio from bodycam footage, transcribes it, and marks it up in minutes, not hours. Released in 2021, it now aids private criminal attorneys as well as public defenders in states such as Texas, Massachusetts, and Kentucky.

Although it did not reveal anything that would have directly proved her client’s innocence, JusticeText did bring to light possible evidence of police malfeasance—specifically, destruction of “apparently and potentially exculpatory evidence,” according to the motion. 

After going through just a few JusticeText analyses of videos, Nicholson Goetz froze when she read the following transcribed directions the investigator gave to the alleged victim, who was a prosecution witness: “Okay, I’m trying to speak vaguely here. Um. Just because, you know, I don’t want this on record.” The witness would later ask the investigator whether they were speaking in confidence, and he responded that he wasn’t going to put their conversation into police reports. In light of this exchange and what Nicholson Goetz describes in the motion as the department’s alleged propensity to “mishandle, destroy, intentionally omit, recklessly fail to preserve evidence,” including camera footage from the night in question that could have had exculpatory value, she filed the motion to dismiss. The judge would eventually dismiss the case, noting in a decision on March 8 that “[the defendant’s] defense has suffered irreparable prejudice due to [the investigator’s] actions.”

“Without JusticeText, the trial would have started, instead of being delayed and dismissed,” Nicholson Goetz says. “This has changed the way that I do my discovery, because now I’m really very curious as to what’s out there.”


No one is entirely sure what is out there. When police departments began buying and deploying bodycams in earnest in the wake of the police killing of Michael Brown in Ferguson, Missouri, in 2014, activists hoped it would bring about real change. The cameras were originally hailed as “supervisory force multipliers,” says Seth Stoughton, a law professor at the University of South Carolina who studies the technology. 

Years later, despite what’s become a multibillion-dollar market for these devices, the tech is far from a panacea.

Part of the problem is scale: bodycams have generated millions of hours of video footage, most of which goes unwatched. Police departments also systematically delay releasing footage, and they often refuse to discipline officers who fail to wear the cameras properly. And when they do finally provide video to the public, it’s often selectively edited, lacking context and failing to tell the complete story. A recent New York Times analysis concluded that bodycams “may do more to serve police interests than those of the public they are sworn to protect.”

A handful of AI startups see this problem as an opportunity to create what are essentially bodycam-to-text programs for different players in the legal system, mining this footage for misdeeds. That could help improve professionalism within the police. But like the bodycams themselves, the technology still faces procedural, legal, and cultural barriers to success. 

In essence, bodycam analysis programs work in three steps. First, a speech recognition algorithm turns the audio into draft text: recordings are broken into phonemes, the smallest units of sound, and a probability analysis determines how those building blocks combine into words and sentences. Then machine-learning and natural-language-processing models, trained on vast collections of text and previous conversations, clean up the rough copy and remove errors. Finally, the system scans the finished copy for specific keywords and patterns, which are flagged and analyzed. 
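The final, keyword-flagging step can be sketched in a few lines. This is a hypothetical illustration, not JusticeText’s or Truleo’s actual implementation (their internals are proprietary), and the pattern labels and search terms here are invented for the example:

```python
import re

# Illustrative trigger patterns a supervisor or attorney might configure.
# These labels and phrases are assumptions, not a real product's defaults.
FLAG_PATTERNS = {
    "off_record": re.compile(r"off the record|don'?t want this on record", re.I),
    "miranda": re.compile(r"\bright to remain silent\b", re.I),
    "use_of_force": re.compile(r"\bstop resisting\b|\btaser\b", re.I),
}

def flag_transcript(segments):
    """Scan transcript segments — (timestamp_seconds, text) pairs —
    and return any that match a configured trigger pattern."""
    hits = []
    for ts, text in segments:
        for label, pattern in FLAG_PATTERNS.items():
            if pattern.search(text):
                hits.append({"time": ts, "label": label, "text": text})
    return hits

# A toy transcript, loosely echoing the exchange described earlier.
transcript = [
    (312.4, "Okay, I'm trying to speak vaguely here."),
    (318.9, "Just because, you know, I don't want this on record."),
    (502.0, "You have the right to remain silent."),
]
hits = flag_transcript(transcript)
```

Each hit carries a timestamp back into the video, which is what lets attorneys jump straight to a flagged moment instead of watching 40 hours of footage end to end.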

The true test of the technology will be whether it can accurately scan and scour hours and hours of unwatched video. But a more important question is whether it can meet a bigger challenge: moving the process and culture of policing toward accountability. 

JusticeText is tackling these issues at the court level. During its development, cofounder Mehrotra and her colleagues surveyed public defenders and found that roughly 80% of their cases involved data of some sort. Most defenders, serving dozens of defendants simultaneously, couldn’t ingest this type of evidence, understand it, and present it in court, putting them at a disadvantage. To help address the problem, JusticeText spent time building tools to set up timelines for attorneys and making sure the system was able to read files from different vendors and police departments. 

Mehrotra says she’s hearing from attorneys who have gotten high-level felony cases dismissed outright after using the program to analyze bodycam recordings. Before, she says, they didn’t have time to pull clips from these files. Now they’re walking into court with highlight reels.

Other firms working on similar technology think its best use is in helping the police more effectively police themselves. Polis Solutions is developing an AI tool to analyze not only audio but also facial expressions caught by these cameras. Another company, Truleo, offers tailored video-to-text transcription services for police departments. Cofounder Anthony Tassone envisions flagging incidents that need review, bringing attention to positive policing, and sharing video clips for training; he hopes to create a “TikTok for cops.” His sales pitch has connected with law enforcement. Currently, 28 departments use the service, including New York City’s. The NYPD signed up for a Truleo trial last October; 1,000 of the force’s roughly 36,000 officers are set to be tracked and trained with the system.

A tech-savvy stock trader who has been building natural-language-processing models for decades, Tassone has long been dedicated to police foundations and causes, donating to fundraisers and serving on the board of the FBI National Academy. The police chiefs he met while fundraising had common complaints: a shortage of supervisors, challenges with new recruits who didn’t meet current performance standards, and the difficulty of rewarding good policing. Tassone saw AI as a way to help address these challenges. Turn the tech loose on the audio of bodycam footage, he thought, and you could create transcripts that supervisors could use to track, study, and improve officers’ interactions with members of the public. 

“Everyone’s perception was that bodycam footage was a liability,” Tassone says. “They [police department higher-ups] didn’t want to view it unless there’s a horrific use-of-force or civilian complaint. I just thought that was dumb. This is like game-time footage of all your athletes.”

Tassone claims that Truleo, which hit the market in 2021, can identify events like an officer frisking someone or reading Miranda rights to a suspect, and calculate a professionalism score. The software doesn’t eliminate human review, he says; it augments it. Police chiefs or supervisors set up lists of keywords or events, get emails and notifications when the system detects these triggers, and then review the footage. Truleo’s tech is installed on department servers, so the data remains sequestered.  

In the company’s own studies, Tassone claims, officers monitored by Truleo always score better than the control group; a study of one client, the police department in Alameda, California, found a 36% reduction in uses of force. No third-party analyses of Truleo have yet been completed; researchers at the nonprofit RTI are currently studying its analysis of bodycam footage from Georgia state parole and probation officers, but results aren’t expected anytime soon. Secure Justice, a nonprofit based in Oakland, California, that focuses on police tech and abuses of power, briefly considered pushing a bill to mandate the use of Truleo across the state, but executive director Brian Hofer says the group hadn’t “done sufficient due diligence at this stage to be comfortable making an aggressive move like that” and may revisit the idea in 2025. 


Still, Hofer suspects the technology does work. In fact, that very efficacy may be one reason it hasn’t been universally welcomed: drama has erupted within two police departments that used and then dropped Truleo. In Vallejo, California, officers and police union officials objected to the introduction of the technology, with its potential to reveal unsavory behavior, and blamed it for inaccuracies and labor violations. The controversy helped accelerate the departure of the department’s reformist chief, Shawny Williams, last July. In Seattle, where the police department also canceled its contract with Truleo amid union objections, an officer was caught on bodycam footage last fall mocking a woman’s death; Truleo had flagged the incident.

Police officers aren’t the only ones with reasons to question this technology, though. The growing use of bodycam-to-text programs, along with increased use of cameras and drones, further normalizes surveillance by law enforcement, adding more everyday interactions to a searchable, indexable database. Jennifer Lee, former manager of the technology and liberty project at the ACLU of Washington, said in a statement that “the potential to use AI technology for purposes other than accountability raises significant questions that must be addressed.” 

“It just opens up law enforcement’s frame of surveillance in a way that we haven’t really previously had to deal with so much but increasingly have to deal with constantly,” says Beryl Lipton, an investigative researcher at the Electronic Frontier Foundation, a nonprofit digital rights group. The recording, transcription, and cataloguing of what someone says on the street in public during interactions with police raises a red flag, she says. She also points to concerns about bias and inaccuracy in the technology itself that arose when phone calls from prisoners were recorded, analyzed, and later made searchable via AI.

It’s difficult to fully address such concerns because, as with many AI systems, the exact way these bodycam-to-text systems work remains opaque, and it’s all the more so when outsiders can’t know what terms police departments are searching for. Besides, the significance of their findings depends on context, says Rob Voigt, a Northwestern University researcher and linguistics expert, who coauthored a 2017 paper that used bodycam footage to measure racial disparities in police attitudes toward minorities.

“Can you make a computational model that can identify what’s a threat or not? That’s really complicated,” he says.

Police departments have little incentive to ask pointed questions about racial bias, Voigt says, and those questions are central to reforming the policing model. The promise of these bodycam-to-text programs won’t be fully met if key terms, phrases, and interactions aren’t commonly tagged and analyzed. And even if they are, trusting law enforcement to do the right thing with whatever data a system like Truleo or Polis produces might be misguided. 

Even Truleo’s Tassone can’t answer the “Who’s watching the watchmen?” question in a wholly satisfying way.

Tassone claims that Truleo has identified otherwise unrecorded use-of-force incidents or police pursuits, but he declines to provide examples. When pressed about whether those incidents get reported to the public, he said, “We have no idea what happens downstream from our technology. That’s on the department to follow up with that. Our mission is to give them the tools to find these things.”

Tools are no good unless they’re used, of course, and the ability to analyze police performance isn’t helpful if bodycam recordings don’t exist in the first place. Cops can simply turn off the camera or “forget” to turn it on, in which case there is simply nothing to examine. And since Truleo sequesters each department’s data, there’s no opportunity to do any meta-analysis of police performance, which academics say would provide a gold mine of data. Advocates for police reform also can’t, say, file a public records request and use the tool to comb archival footage for the past actions of certain cops, or find patterns of misbehavior; no third party is using it to watch how cops are behaving. EFF’s Lipton says if the technology works as advertised, and its makers really want to improve public trust, it seems like “a waste of money and a missed opportunity” not to make it available to the public. 

Finally, law enforcement will likely continue to push back against the release of bodycam recordings, and slow-walk and delay the release of any questionable footage. 

“We need to worry about applying technological solutions to sociological problems,” says the University of South Carolina’s Stoughton. “Nothing about Truleo will fundamentally change the culture of a police department, or alter a public commitment to accountability that doesn’t have behind-the-doors follow-through.” 

Truleo’s track record when it comes to changing the culture of law enforcement is decidedly mixed so far. The police force in Castle Shannon, Pennsylvania, has used the system for roughly two years with generally positive results. Chief of police Ken Truver, a self-described evangelist for law enforcement tech, says that even with just 14 officers in his department, actually watching and utilizing hours of bodycam footage becomes overwhelming. He’s constantly thinking about what he’s missing, since even mandated spot checks involve only about 1% of what’s stored. Truleo helps him efficiently review behavior; every day at 6 p.m. he gets an email with incidents to flag and check. He claims he hasn’t found any untoward behavior.

“Police are generally cynical of technology,” he says. “The first thing they’re gonna say is, ‘How could I possibly get in trouble with this technology?’ The purpose of the technology is not to find an officer doing something wrong. Matter of fact, the opposite is true. I want to find my officers doing something right.”

But the rank-and-file officers in Vallejo, California, argued that the technology did the reverse. Their frustration with the system was one factor that led to the departure of the chief responsible for setting up Truleo. Michael Nichelini, president of the Vallejo Police Officers’ Association, feels the technology was meant for discipline. He’s not against transparency, he says, but he’s sick of efforts to change the department’s culture. He’s frustrated with poor pay and understaffing, and he objects to what he sees as efforts to fix those issues with software instead of addressing them with better leadership and compensation. In the Seattle case, too, a police union president accused department leaders of using the tech to “spy” on officers.  

Tassone, who was raised in a police and military family on Chicago’s South Side, dismisses the privacy complaints from cops. 

“Nothing they say on camera is considered private. That’s all public records,” he says. “All my work on Wall Street for 20 years, all my emails, had identical employee quality assurance monitoring in place. The idea that police wouldn’t have it when they have guns and badges, ability to take people’s rights—nobody’s going to support that.”

Tassone says Truleo takes issues of privacy for police and civilians very seriously: only audio is analyzed, never the video itself, and personally identifiable information is automatically scrubbed. “We need to be super sensitive and super delicate around how the technology is deployed to maximize civilian privacy,” he says.

He believes these kinds of safeguards make the proposition simply irresistible: better transparency for a low monthly fee. He expects that local and state lawmakers will mandate the use of Truleo or other bodycam-to-text software. 

“This is going to be the future of policing,” he says. “It’ll be the law within five years.”

Patrick Sisson, a Chicago expat living in Los Angeles, covers technology and urbanism.
