Software trained to know the difference between an honest mistake and intentional vandalism is being rolled out in an effort to make editing Wikipedia less psychologically bruising. It was developed by the Wikimedia Foundation, the nonprofit organization that supports Wikipedia.
One motivation for the project is a significant decline in the number of people considered active contributors to the flagship English-language Wikipedia: it has fallen by 40 percent over the past eight years, to about 30,000. Research indicates that the problem is rooted in Wikipedians’ complex bureaucracy and their often hard-line responses to newcomers’ mistakes, enabled by semi-automated tools that make deleting new changes easy (see “The Decline of Wikipedia”).
Aaron Halfaker, a senior research scientist at the Wikimedia Foundation who helped diagnose that problem, is now leading a project to fight it, one that relies on algorithms with a sense for human fallibility. His ORES system, short for "Objective Revision Evaluation Service," can be trained to score the quality of new changes to Wikipedia and judge whether an edit was made in good faith or not.
Halfaker invented ORES in hopes of improving tools that help Wikipedia editors by showing recent edits and making it easy to undo them with a single click. The tools were invented to meet a genuine need for better quality control after Wikipedia became popular, but an unintended consequence is that new editors can find their first contributions wiped out without explanation because they unwittingly broke one of Wikipedia’s many rules.
ORES can allow editing tools to direct people to review the most damaging changes. The software can also help editors treat rookie or innocent mistakes more appropriately, says Halfaker. “I suspect the aggressive behavior of Wikipedians doing quality control is because they’re making judgments really fast and they’re not encouraged to have a human interaction with the person,” he says. “This enables a tool to say, ‘If you’re going to revert this, maybe you should be careful and send the person who made the edit a message.’”
ORES is up to speed on the English, Portuguese, Turkish, and Farsi versions of Wikipedia so far. To learn to judge the quality of edits and distinguish damaging edits from innocent mistakes, it drew on data generated by Wikipedia editors who used an online tool to label examples of past edits. Some of the Wikipedians who maintain editing tools have already begun experimenting with the system.
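The idea of scoring a revision from labeled examples can be illustrated with a toy sketch. This is not ORES's actual model (the features, weights, and edit format below are all invented for illustration); it only shows the general shape of the approach, in which numeric features extracted from an edit are combined into a probability that the edit is damaging.

```python
# Toy sketch of revision scoring in the spirit of ORES.
# ASSUMPTION: the features, weights, and edit dictionary format below are
# invented for illustration; ORES's real models and features differ.
import math

def features(edit):
    """Turn an edit into a small numeric feature vector."""
    added = edit["added_text"]
    # Crude shouting/profanity signal (hypothetical word list).
    bad_words = sum(added.lower().count(w) for w in ("stupid", "!!!", "lol"))
    return [
        len(added) / 100.0,                  # size of the change
        float(bad_words),                    # bad-word count
        1.0 if edit["anonymous"] else 0.0,   # made while logged out
    ]

# Hand-set weights standing in for a model trained on labeled edits.
WEIGHTS = [-0.1, 1.5, 0.8]
BIAS = -1.0

def damaging_probability(edit):
    """Logistic score: estimated probability the edit is damaging."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features(edit)))
    return 1.0 / (1.0 + math.exp(-z))

vandalism = {"added_text": "lol this is stupid!!!", "anonymous": True}
good_faith = {"added_text": "Born in 1971 in Lyon, France.", "anonymous": False}

print(damaging_probability(vandalism) > damaging_probability(good_faith))
```

A tool built on such scores could then sort a queue of recent edits so that reviewers see the likeliest vandalism first, while low-scoring edits from newcomers get a gentler path.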
Earlier efforts to make Wikipedia more welcoming to newcomers have been stymied by the very community that’s supposed to benefit. Wikipedians rose up in 2013 when Wikimedia made a word-processor-style editing interface the default, forcing the foundation to make it opt-in instead. To this day, the default editor uses a complicated markup language called Wikitext.
Halfaker believes his new algorithmic editing assistant will be accepted, because although it’s more sophisticated than previous software unleashed on Wikipedia, it isn’t being forced on users. “In some ways it’s weird to introduce AI and machine learning to a massive social thing, but I don’t see what we’re doing as any different to making other software changes to the site,” he says. “Every change we make affects behavior.”