
Who’s Messing with Wikipedia?

The back-and-forth behind controversial entries could help reveal their true value.
February 6, 2009

Despite warnings from many high-school teachers and college professors, Wikipedia is one of the most-visited websites in the world (not to mention the biggest encyclopedia ever created). But even as Wikipedia’s popularity has grown, so has the debate over its trustworthiness. One of the most serious concerns remains that its articles are written and edited by a hidden army of people with unknown interests and biases.

Ed Chi, a senior research scientist for augmented social cognition at the Palo Alto Research Center (PARC), and his colleagues have now created a tool, called WikiDashboard, that aims to reveal much of the normally hidden back-and-forth behind Wikipedia’s most controversial pages in order to help readers judge for themselves how suspect its contents might be.

Wikipedia already has procedures in place designed to alert readers to potential problems with an entry. For example, one of Wikipedia’s volunteer editors can review an article and tag it as “controversial” or warn that it “needs sources.” But in practice, Chi says, relatively few articles actually receive these tags. WikiDashboard instead offers a snapshot of the edits and re-edits, as well as the arguments and counterarguments, that went into building each of Wikipedia’s millions of pages.

The researchers began by investigating pages already tagged as “controversial” on Wikipedia: they found that these pages were far more likely to have been edited and re-edited repeatedly. Based on this observation, they developed WikiDashboard, a website that serves up Wikipedia entries but adds a chart to the top of each page revealing its recent edit history.
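The signal itself is straightforward to reproduce. Below is a minimal sketch in Python, assuming only Wikipedia’s public MediaWiki API (an illustration, not PARC’s actual pipeline): it pages through an article’s revision history and counts the revisions, the raw statistic the researchers found tracks controversy.

```python
# Hypothetical illustration, not WikiDashboard's code: count an article's
# revisions via the public MediaWiki API as a rough controversy signal.
import requests

API = "https://en.wikipedia.org/w/api.php"

def count_revisions(title, cap=5000):
    """Page through an article's edit history and count revisions (up to cap)."""
    total, cont = 0, {}
    while total < cap:
        params = {
            "action": "query", "format": "json",
            "prop": "revisions", "titles": title,
            "rvprop": "ids", "rvlimit": 500,  # 500 per request is the anonymous-client limit
            **cont,
        }
        data = requests.get(API, params=params).json()
        page = next(iter(data["query"]["pages"].values()))
        total += len(page.get("revisions", []))
        if "continue" not in data:  # reached the start of the history
            break
        cont = {"rvcontinue": data["continue"]["rvcontinue"]}
    return total

# A quiet article accumulates far fewer revisions than a contested one.
print(count_revisions("Windburn"))
```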

WikiDashboard shows which users have contributed the most edits to a page, what percentage of the edits each person is responsible for, and when editors have been most active. A WikiDashboard user can explore further by clicking on a particular editor’s name to see, for example, how involved he or she has been with other articles. Chi says that the goal is to show the social interaction going on around the entry. For instance, the chart should make it clear when a single user has been dominating a page, or when a flurry of activity has exploded around a particularly contentious article. The timeline on the chart can also show how long a page has been neglected.
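The per-editor breakdown behind that chart is essentially a frequency count over the same revision history. A second hypothetical sketch (again assuming the public MediaWiki API rather than WikiDashboard’s own data feed) computes each editor’s share of a page’s edits:

```python
# Hypothetical illustration: compute each editor's share of a page's edits.
from collections import Counter

import requests

API = "https://en.wikipedia.org/w/api.php"

def editor_shares(title, cap=5000):
    """Return (editor, fraction-of-edits) pairs for an article, largest first."""
    counts, cont = Counter(), {}
    while sum(counts.values()) < cap:
        params = {
            "action": "query", "format": "json",
            "prop": "revisions", "titles": title,
            "rvprop": "user", "rvlimit": 500,
            **cont,
        }
        data = requests.get(API, params=params).json()
        page = next(iter(data["query"]["pages"].values()))
        # Revisions whose author was suppressed carry no "user" field.
        counts.update(r.get("user", "(hidden)") for r in page.get("revisions", []))
        if "continue" not in data:
            break
        cont = {"rvcontinue": data["continue"]["rvcontinue"]}
    total = sum(counts.values())
    return [(user, n / total) for user, n in counts.most_common()]

for user, share in editor_shares("Windburn")[:5]:
    print(f"{user}: {share:.0%}")  # a dominant editor shows up as an outsized share
```

Grouping the same counts by revision timestamp instead of by user would yield the activity timeline the dashboard also plots.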

Courting controversy: WikiDashboard gathers information about the social interactions underlying Wikipedia entries and displays it to the user. The entry for former U.S. president George W. Bush stood out as the most controversial. The researchers discovered that certain statistics, such as the total number of revisions made to an article, could accurately predict controversy.

The page on Hillary Clinton, for example, shows that the main contributor has put in about 20 percent of the edits, which Chi says suggests that a single individual has guided much of the article’s direction. In contrast, the entry on windburn shows a much less heated scene: more even collaboration among the contributors.

The researchers released an early version of the tool in 2007, built on data dumps that Wikipedia publishes only a few times a year. But Chi says that this version of WikiDashboard was limited, since it couldn’t keep pace with changes happening on the live site. His team spent much of 2008 getting access to live data, which Chi says was difficult because of Wikipedia’s limited resources.

Daniel Tunkelang, chief scientist at Endeca, an information analysis firm based in Cambridge, MA, says that the tool is a step toward exploring the social context of Wikipedia entries, but he adds, “There’s some room for compressing this into something more consumable.” By this, Tunkelang means that the software could be more useful to the casual user if it summarized data more effectively. For example, he says that the list of articles that each editor has worked on could be shown as just a handful of easy-to-read tags.

At a talk given by Chi this week, Rob Miller, an associate professor at MIT’s Computer Science and Artificial Intelligence Laboratory, noted that some Wikipedia editors try to rack up a high number of edits simply to gain kudos. He wondered how that tendency might affect WikiDashboard’s measurements should the tool catch on.

Chi’s group is still working on WikiDashboard, and on Wikipedia data more generally. He says that he’d like to see a system that measures not just simple statistics, such as the number of edits made, but also the quality of those contributions.
