Despite warnings from many high-school teachers and college professors, Wikipedia is one of the most-visited websites in the world (not to mention the biggest encyclopedia ever created). But even as Wikipedia’s popularity has grown, so has the debate over its trustworthiness. One of the most serious concerns remains the fact that its articles are written and edited by a hidden army of people with unknown interests and biases.
Ed Chi, a senior research scientist for augmented social cognition at the Palo Alto Research Center (PARC), and his colleagues have now created a tool, called WikiDashboard, that aims to reveal much of the normally hidden back-and-forth behind Wikipedia’s most controversial pages in order to help readers judge for themselves how suspect its contents might be.
Wikipedia already has procedures in place designed to alert readers to potential problems with an entry. For example, one of Wikipedia’s volunteer editors can review an article and tag it as “controversial” or warn that it “needs sources.” But in practice, Chi says, relatively few articles actually receive these tags. WikiDashboard instead offers a snapshot of the edits and re-edits, as well as the arguments and counterarguments, that went into building each of Wikipedia’s many millions of pages.
The researchers began by investigating pages already tagged as “controversial” on Wikipedia: they found that these pages were far more likely to have been edited and re-edited repeatedly. Based on this observation, they developed WikiDashboard, a website that serves up Wikipedia entries but adds a chart to the top of each page revealing its recent edit history.
WikiDashboard shows which users have contributed the most edits to a page, what percentage of the edits each person is responsible for, and when editors have been most active. A WikiDashboard user can explore further by clicking on a particular editor’s name to see, for example, how involved he or she has been with other articles. Chi says that the goal is to show the social interaction going on around the entry. For instance, the chart should make it clear when a single user has been dominating a page, or when a flurry of activity has exploded around a particularly contentious article. The timeline on the chart can also show how long a page has been neglected.
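The per-editor statistics described above amount to a simple aggregation over a page’s revision history. The sketch below illustrates the idea with invented sample data; the editor names and timestamps are hypothetical, and the real tool works from Wikipedia’s live revision feeds rather than a hard-coded list:

```python
from collections import Counter

# Hypothetical revision history for one page: (editor, date) pairs.
# WikiDashboard derives comparable data from Wikipedia's revision records.
revisions = [
    ("Alice", "2009-01-02"), ("Bob", "2009-01-03"),
    ("Alice", "2009-01-05"), ("Carol", "2009-01-06"),
    ("Alice", "2009-01-08"),
]

def edit_shares(revs):
    """Return each editor's share of a page's edits, largest first."""
    counts = Counter(editor for editor, _ in revs)
    total = len(revs)
    return [(editor, n / total * 100) for editor, n in counts.most_common()]

for editor, share in edit_shares(revisions):
    print(f"{editor}: {share:.0f}% of edits")
```

A page dominated by one contributor (here, a 60 percent share) would stand out immediately in such a summary, which is the kind of signal the dashboard’s chart surfaces at a glance.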
The page on Hillary Clinton, for example, shows that the main contributor has put in about 20 percent of the edits. Chi says this suggests that this individual has guided a lot of the article’s direction. In contrast, an entry on windburn shows a much less heated scene: more even collaboration among the contributors.
The researchers released an early version of the tool in 2007 using data dumps that Wikipedia published only a few times a year. But Chi says that this version of WikiDashboard was limited, since it couldn’t reflect how quickly pages were changing online. His team spent much of 2008 getting access to live data, which Chi says was difficult because of Wikipedia’s limited resources.
Daniel Tunkelang, chief scientist at Endeca, an information analysis firm based in Cambridge, MA, says that the tool is a step toward exploring the social context of Wikipedia entries, but he adds, “There’s some room for compressing this into something more consumable.” By this, Tunkelang means that the software could be more useful to the casual user if it summarized data more effectively. For example, he says that the list of articles that each editor has worked on could be shown as just a handful of easy-to-read tags.
At a talk given by Chi this week, Rob Miller, an associate professor at MIT’s Computer Science and Artificial Intelligence Lab, noted that some Wikipedia editors try to rack up a high number of edits just to gain more kudos. He wondered how that tendency might affect WikiDashboard’s measurements should the tool catch on.
Chi’s group is still working on WikiDashboard, and on Wikipedia data more generally. He says that he’d like to see a system that measures not just simple statistics such as the number of edits made, but also the quality of those contributions.