Deepfakes have become a symbol for the end of truth and, to some, a potential tool to swing elections. (Never mind that most deepfakes are still fake porn.) Everyone from the US government to tech giants to startups is trying to develop deepfake-busting technology. But a new report out today from Witness, a nonprofit that studies synthetic media, points out how these tools could go wrong.
The techniques: Manipulated video is not a new issue, and there are plenty of social problems that even the best deepfake detector can’t fix. (For example, knowing that a video has been edited doesn’t automatically answer the question of whether it should be taken down. What if it’s satire?) That hasn’t prevented companies like Amber Video, Truepic, and eWitness from developing “verified-at-capture” or “controlled-capture” technologies. These use a variety of techniques to sign, geotag, and time-stamp an image or video when it’s created. In theory, this makes it easier to tell if the media has been tampered with.
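To make the idea concrete, here is a minimal sketch of what a verified-at-capture scheme might look like. This is an invented illustration, not how Amber Video, Truepic, or eWitness actually implement it: the device key, field names, and HMAC construction are all assumptions for the sake of the example. The core idea is that the capture device signs a hash of the media bytes together with a timestamp and geotag, so any later edit breaks verification.

```python
import hashlib
import hmac
import json
import time

# Assumption: each capture device holds a provisioned secret key.
# Real systems would more likely use asymmetric signatures and
# hardware-backed key storage; HMAC keeps the sketch self-contained.
DEVICE_KEY = b"secret-key-provisioned-to-device"

def sign_at_capture(media_bytes, lat, lon):
    """Hash the media and sign it along with capture metadata."""
    record = {
        "sha256": hashlib.sha256(media_bytes).hexdigest(),
        "timestamp": int(time.time()),
        "geotag": {"lat": lat, "lon": lon},
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify(media_bytes, record):
    """Return True only if both the metadata and the media are untouched."""
    claims = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(claims, sort_keys=True).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, record["signature"]):
        return False  # signature or metadata was tampered with
    return hashlib.sha256(media_bytes).hexdigest() == record["sha256"]

video = b"\x00\x01fake-video-bytes"
rec = sign_at_capture(video, 42.36, -71.09)
assert verify(video, rec)             # untouched media verifies
assert not verify(video + b"x", rec)  # any edit breaks the hash
```

Note that this only proves the file hasn't changed since capture; as the report stresses, it says nothing about whether the content is true, fair, or safe to publish.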
What’s the problem? The Witness report lays out 14 different ways that these technologies could actually end up being harmful. Some of the key ones:
—The tools being built could be used to surveil people
—Technical constraints could stop these tools from working in places where they’re most needed (and those using old hardware could be left behind)
—Jailbroken devices won’t be able to capture verifiable material
—Companies could delete the data or not let individuals control it
—Requiring more verification for media in court could make the legal process longer and more expensive
So what can be done? There’s no easy solution to these problems, says Witness program director Sam Gregory. The companies building these technologies must confront these questions and think about the people most likely to be harmed, he adds. Synthetic media tools themselves can also be built more ethically: technology expert Aviv Ovadya, for instance, has proposed ways to make deepfake tools more responsible, and companies can vet which clients are allowed to use their tools and explicitly penalize those who violate their norms. Synthetic media of all kinds are going to become more common. It’ll take a lot of different tactics to keep us all safe.