The official motto of the Internet could be “don’t believe everything you read,” but moves are afoot to help users better judge what to be skeptical of and what to trust.

A tool called WikiTrust, which helps users evaluate information on Wikipedia by automatically assigning a reliability color-coding to text, came into the spotlight this week with news that it could be added as an option for general users of Wikipedia. Also, last week the Wikimedia Foundation announced that changes made to pages about living people will soon need to be vetted by an established editor. These moves reflect a broader drive to make online information more accountable. And this week the World Wide Web Consortium published a framework that could help any Web site make verifiable claims about authorship and reliability of content.

WikiTrust, developed by researchers at the University of California, Santa Cruz, color-codes the information on a Wikipedia page using algorithms that evaluate the reliability of the author and the information itself. The algorithms do this by examining how well-received the author’s contributions have been within the community: they look at how quickly a user’s edits are revised or reverted, and they weigh the reputation of the people who interact with the author. If a disreputable editor changes something, the original author won’t necessarily lose many reputation points. A white background, for example, means that a piece of text has been viewed by many editors who did not change it and that it was written by a reliable author. Shades of orange signify doubt, dubious authorship, or ongoing controversy.
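The mechanics can be sketched in a few lines of code. The Python below is a minimal illustration, not WikiTrust’s actual implementation: the names (`Author`, `record_edit_outcome`, `trust_color`) and all the specific weights and thresholds are assumptions made for exposition. What it preserves from the description above is the key asymmetry: an author’s reputation moves in proportion to the reputation of whoever judged the edit, so a revert by a disreputable editor costs little.

```python
# Illustrative sketch of a WikiTrust-style reputation system.
# All names, weights, and thresholds are assumptions for exposition,
# not the actual WikiTrust algorithm.

class Author:
    def __init__(self, name):
        self.name = name
        self.reputation = 0.5  # start at a neutral reputation in [0, 1]

def record_edit_outcome(author, survived, reviewer):
    """Update an author's reputation after one of their edits is judged.

    If the edit survives, reputation rises; if it is reverted, reputation
    falls -- but only in proportion to the *reviewer's* own reputation,
    so a disreputable editor's revert costs the original author little.
    """
    step = 0.1 * reviewer.reputation  # weight the outcome by who judged it
    if survived:
        author.reputation = min(1.0, author.reputation + step)
    else:
        author.reputation = max(0.0, author.reputation - step)

def trust_color(text_trust):
    """Map a text-trust score in [0, 1] to a background color.

    High trust (text by reputable authors, left unchanged by many
    editors) renders white; lower trust renders shades of orange.
    """
    if text_trust > 0.9:
        return "white"
    if text_trust > 0.6:
        return "light orange"
    return "orange"

# Example: a reputable editor reverts a newcomer's edit.
newcomer, veteran = Author("newcomer"), Author("veteran")
veteran.reputation = 0.9
record_edit_outcome(newcomer, survived=False, reviewer=veteran)
print(newcomer.reputation)               # 0.41 -- penalized by a trusted reviewer
print(trust_color(newcomer.reputation))  # "orange"
```

In the same simplified spirit, a passage’s trust score would climb each time a reputable editor viewed it without changing it, which is what makes a white background meaningful.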

Luca de Alfaro, an associate professor of computer science at UC Santa Cruz who helped develop WikiTrust, says that most Web users crave more accountability. “Fundamentally, we want to know who did what,” he says. According to de Alfaro, WikiTrust makes it harder to change information on a page without anyone noticing, and it makes it easy to see and analyze what’s happening on a page.

The researchers behind WikiTrust are working on a version that includes a full analysis of all the edits made to the English-language version of Wikipedia since its inception. A demo of the full version will be released within the next couple of months, de Alfaro says, though it’s still uncertain whether it will be hosted on the university’s own servers or by the Wikimedia Foundation. The principles behind WikiTrust’s algorithms could be applied to any site with collaboratively created content, de Alfaro adds.
