
MIT Technology Review


Despite warnings from many high-school teachers and college professors, Wikipedia is one of the most-visited websites in the world (not to mention the biggest encyclopedia ever created). But even as Wikipedia’s popularity has grown, so has the debate over its trustworthiness. One of the most serious concerns remains the fact that its articles are written and edited by a hidden army of people with unknown interests and biases.

Ed Chi, a senior research scientist for augmented social cognition at the Palo Alto Research Center (PARC), and his colleagues have now created a tool, called WikiDashboard, that aims to reveal much of the normally hidden back-and-forth behind Wikipedia’s most controversial pages in order to help readers judge for themselves how suspect its contents might be.

Wikipedia already has procedures in place designed to alert readers to potential problems with an entry. For example, one of Wikipedia’s volunteer editors can review an article and tag it as “controversial” or warn that it “needs sources.” But in practice, Chi says, relatively few articles actually receive these tags. WikiDashboard instead offers a snapshot of the edits and re-edits, as well as the arguments and counterarguments, that went into building each of Wikipedia’s millions of pages.

The researchers began by investigating pages already tagged as “controversial” on Wikipedia: they found that these pages were far more likely to have been edited and re-edited repeatedly. Based on this observation, they developed WikiDashboard, a website that serves up Wikipedia entries but adds a chart to the top of each page revealing its recent edit history.
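The researchers’ observation suggests a simple heuristic: pages with unusually high revision counts are candidates for closer scrutiny. A minimal sketch of that idea, using invented page titles and edit counts (not real Wikipedia data or the PARC team’s actual method), might look like this:

```python
# Hypothetical sketch: flag pages whose revision count is unusually high,
# echoing the observation that "controversial" pages tend to be edited
# and re-edited repeatedly. All counts below are made-up illustrative data.
revision_counts = {
    "Article A": 4210,
    "Article B": 310,
    "Article C": 3890,
    "Article D": 95,
}

def flag_candidates(counts, threshold=1000):
    """Return page titles whose edit count exceeds the threshold, sorted."""
    return sorted(page for page, n in counts.items() if n > threshold)

print(flag_candidates(revision_counts))
```

A real system would of course need a smarter baseline than a fixed threshold, since legitimate high-traffic articles also accumulate many edits.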

WikiDashboard shows which users have contributed the most edits to a page, what percentage of the edits each person is responsible for, and when editors have been most active. A WikiDashboard user can explore further by clicking on a particular editor’s name to see, for example, how involved he or she has been with other articles. Chi says that the goal is to show the social interaction going on around the entry. For instance, the chart should make it clear when a single user has been dominating a page, or when a flurry of activity has exploded around a particularly contentious article. The timeline on the chart can also show how long a page has been neglected.
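The per-editor statistics described above amount to a simple aggregation over a page’s revision history. The following sketch shows one way to compute them; the editor names and dates are invented, and this is not PARC’s actual implementation:

```python
from collections import Counter

# Hypothetical edit log for one article: (editor, date) pairs.
# WikiDashboard aggregates real revision histories; this data is invented.
edits = [
    ("Alice", "2009-01-03"),
    ("Bob",   "2009-01-04"),
    ("Alice", "2009-01-10"),
    ("Alice", "2009-02-01"),
    ("Carol", "2009-02-02"),
]

def edit_summary(edit_log):
    """Per-editor edit counts and share of total edits, most active first."""
    counts = Counter(editor for editor, _ in edit_log)
    total = len(edit_log)
    return [(editor, n, round(100.0 * n / total, 1))
            for editor, n in counts.most_common()]

for editor, n, pct in edit_summary(edits):
    print(f"{editor}: {n} edits ({pct}%)")
```

A summary like this makes a dominant editor immediately visible: here one invented user accounts for 60% of the edits. Bucketing the dates would likewise reveal bursts of activity or long stretches of neglect.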


Credits: Technology Review, WikiDashboard/PARC

Tagged: Communications, Web, social media, visualization, Wikipedia, wiki, PARC, online collaboration
