Augur, a new blockchain-based prediction market platform, is getting a lot of media attention because people are using it to predict the deaths of celebrities. But despite fears that the rise of “assassination markets” could inspire real killings, the more urgent problem Augur presents is something else entirely.

This piece first appeared in our twice-weekly newsletter Chain Letter, which covers the world of blockchain and cryptocurrencies. Sign up here; it's free!

Really, people saw these sorts of “death pools” coming decades ago, and blockchains, with their decentralized networks and (potentially) anonymous transactions, serve as an ideal platform. Augur’s open-source software relies on blockchain-based computer programs called smart contracts to let users set up their own prediction markets that automatically pool cryptocurrency bets and distribute winnings without the need for participants to identify themselves. Perfect for ginning up interest in offing someone by guaranteeing a payday to whoever does the deed, at least in theory. Predictably, the Ethereum-based protocol, which launched July 10, has already led to markets for forecasting the demise of Donald Trump, Jeff Bezos, Warren Buffett, Betty White, and others. But these markets have seen very few transactions, and the amounts wagered have been tiny, making it unlikely they’d inspire someone to engage in foul play.
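To make that mechanism concrete, here is a minimal sketch, in Python for readability (Augur's actual markets are Ethereum smart contracts, not Python), of how a pooled, parimutuel-style prediction market can take bets and pay out winners while knowing nothing about its participants beyond pseudonymous addresses and balances. The class and method names are illustrative, not Augur's real code.

```python
# Illustrative sketch only; not Augur's actual contract logic.
# Shows how a pooled market can settle using nothing but
# pseudonymous addresses and the amounts they staked.

class PredictionMarket:
    def __init__(self, question, outcomes):
        self.question = question
        # For each outcome, map a bettor's address to the amount staked.
        self.stakes = {outcome: {} for outcome in outcomes}
        self.resolved_outcome = None

    def bet(self, address, outcome, amount):
        """Record a stake; 'address' is just a pseudonymous key."""
        if self.resolved_outcome is not None:
            raise ValueError("market already resolved")
        pool = self.stakes[outcome]
        pool[address] = pool.get(address, 0) + amount

    def resolve(self, outcome):
        """Set the winning outcome (in Augur, this is what REP reporters do)."""
        self.resolved_outcome = outcome

    def payouts(self):
        """Split the entire pot among winners, pro rata to their stakes."""
        total_pot = sum(sum(pool.values()) for pool in self.stakes.values())
        winners = self.stakes[self.resolved_outcome]
        winning_total = sum(winners.values())
        return {addr: total_pot * stake / winning_total
                for addr, stake in winners.items()}


# Example: a market on whether some event happens by a given date.
market = PredictionMarket("Does X happen by 2019-01-01?", ["YES", "NO"])
market.bet("0xabc", "YES", 2.0)   # hypothetical addresses, amounts in ether
market.bet("0xdef", "NO", 3.0)
market.resolve("NO")
print(market.payouts())  # {'0xdef': 5.0}
```

Nothing in that settlement logic needs a name, a bank account, or a company operating the market, which is exactly what makes the "assassination market" scenario conceivable, and what makes the legal questions below so awkward.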

Nevertheless, Augur may already be facilitating illegal activity that could prove far more troublesome.

In the US, prediction markets are generally not permitted. Federal and state laws prohibit online gambling, and “in many ways the line between prediction markets and gambling is not that clear,” says Aaron Wright, a professor at the Cardozo School of Law in New York City. Further, some Augur contracts allow users to bet on the future value of something, such as Ether cryptocurrency. That sounds a lot like a type of investment called a binary option, which is unlawful to list without approval from the Commodity Futures Trading Commission. In 2012, the CFTC sued Intrade, an Ireland-based prediction market, accusing it of permitting US users to trade binary options, and eventually a judge blocked Intrade from offering the contracts in the US.

Sure enough, Augur already has the CFTC’s attention. But even if the agency decides that Augur is breaking the law, how will it enforce that decision? Augur’s creators claim they don’t have control over what its users choose to do with the protocol—or the ability to shut it down. This creates a problem that is “endemic” to blockchain technology, says Wright, who recently co-wrote a book on the subject: “If you do not have a very concrete intermediary—i.e., a company or group of people that are running the marketplace—how do you apply laws and prevent that activity from occurring?” 

When Napster, Limewire, and other peer-to-peer file-sharing networks started slinging music, movies, and other files around the internet two decades ago, they created similar law enforcement headaches. But in each case there was an entity that could be sued for copyright infringement. Software like Augur, which is open-source, freely downloadable, and run on a blockchain, presents genuinely new challenges, says Wright. 

That doesn’t mean laws can’t be applied—just that they’ll need to be applied in different ways. For instance, if officials were to rule that Augur was facilitating illicit activity, they might try to go after the people who developed the software, much as malware developers have been held liable for their creations, says Wright. That would likely set up a fight over First Amendment protections. Prosecutors could also try to target the users who keep the protocol running. Called “reporters,” they use Augur’s tradable crypto-token, REP, to report outcomes and are rewarded with more tokens if their reports are consistent with the larger consensus. There are potentially other avenues, as well, says Wright: “Just because there is no center doesn’t mean there aren’t indirect ways to attack lawless activity.”
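A rough sketch of that reporting incentive, under a simplifying assumption: treat reporting as a single stake-weighted vote (real Augur reporting adds dispute rounds and forking). Reporters stake REP on the outcome they believe occurred; in this sketch, those who side with the stake-weighted majority keep their stake and split the stakes of those who did not. The function name and exact reward split are illustrative, not Augur's specification.

```python
# Simplified sketch of stake-weighted outcome reporting.
# Illustrates the basic incentive only: agree with consensus, gain REP;
# disagree, and your staked REP is forfeited.

def settle_reports(reports):
    """reports: list of (reporter, outcome, rep_staked) tuples."""
    # Tally the REP staked behind each reported outcome.
    totals = {}
    for _, outcome, stake in reports:
        totals[outcome] = totals.get(outcome, 0) + stake
    consensus = max(totals, key=totals.get)

    # Stakes of dissenting reporters are redistributed to the majority.
    losing_pot = sum(stake for _, outcome, stake in reports
                     if outcome != consensus)
    winning_pot = totals[consensus]

    rewards = {}
    for reporter, outcome, stake in reports:
        if outcome == consensus:
            # Keep your stake plus a pro-rata share of the losers' stakes.
            rewards[reporter] = stake + losing_pot * stake / winning_pot
        else:
            rewards[reporter] = 0
    return consensus, rewards


reports = [("alice", "YES", 10), ("bob", "YES", 5), ("carol", "NO", 4)]
print(settle_reports(reports))
# Consensus is "YES"; alice and bob split carol's 4 REP pro rata, carol gets 0.
```

The point of the sketch is that the people a prosecutor might target are not employees of a marketplace but token holders scattered across the network, each performing one small, automated-looking step, which is what Wright means by having to find "indirect ways" to reach the activity.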
