
Renegade Encyclopedia

Cofounder Jimmy Wales updates Technology Review on Wikipedia.
August 8, 2006

Wikipedia, the online encyclopedia that anyone may write and edit, is now in its sixth year and has nearly 1.3 million articles in English. Recently, Wikipedians from around the world gathered in Cambridge, MA, to discuss, among other things, how to make the enormous online encyclopedia more accurate, more organized, and easier to use. Author and Web expert Lawrence Lessig referred to the conference, known as Wikimania 2006, as “the Woodstock of the 21st century.”

Jimmy Wales, cofounder of Wikipedia. (Credit: Andrew Lih)

Technology Review asked Jimmy Wales, the encyclopedia’s cofounder, to update us on his project. His message: Wikipedia is akin to rock-and-roll music in the 1950s: many people are skeptical of it because it’s unconventional. It’s also not perfect, but with the help of its software engineers, administrators, and contributors, it will get better. Given time, Wales says, Wikipedia will be as significant as Elvis, doing for the reference-book world what the king of rock and roll did for music.

Technology Review: Larry Sanger, one of your early collaborators on Wikipedia, once described Wiki entries as not necessarily the truth, but at best, a kind of “received truth,” or the consensus view on a subject. Do you agree, and do you think this is problematic?

Jimmy Wales: I don’t think “received truth” has ever been clearly defined. When you have thoughtful, reasonable individuals discussing how to present things in a way that’s satisfactory to a broad range of people, and when you use scholarly standards, such as requiring authors to cite sources, the end result isn’t very different from a traditional reference work.

TR: Sometimes Wikipedia entries are not the received truth. They’re not even accurate, by the best lights of the consensus view.

JW: Wikipedia is a work in progress. Mistakes are made during the editing process, and they sometimes persist before there is time to correct them. There are errors in any large-scale human product. I think people have the wrong idea of how accurate traditional reference works are. In the study done by the journal Nature last December, experts seriously looking for errors found about three errors per article in the Encyclopedia Britannica.

The real question is: Does our process weed out errors? The study that hasn’t been done, but would be worth doing, is comparing 100 random Wikipedia articles to themselves two, three, and four years ago, to see the trajectory. I think you’d see dramatic improvements in almost every case. In some cases, you may find we had it better a year ago, which means something went wrong with our system.

TR: The Nature study also found about four errors per Wikipedia article compared with Britannica’s three. Why is Wikipedia wrong more often than conventional encyclopedias?

JW: Because it’s new. If you look at the best articles on Wikipedia – the work that has had the most attention, diverse contributors, and healthy dialogue – they’re significantly better than conventional encyclopedias. In areas where we’re not as good, we’re improving in that direction.

TR: What do you say to educators who tell students not to use Wikipedia for their papers because it isn’t a reputable or reliable source?

JW: I say that in the 1950s, parents told their kids not to listen to Elvis Presley. It’s ridiculous to tell college students not to use Wikipedia. They all use it. Educators shouldn’t abandon their responsibility to help students cope with the world in an adult manner. They should teach students to critically judge sources. They should teach about how Wikipedia is created and its strengths and weaknesses. And they should tell students when to use an encyclopedia versus when to step into the primary literature.

Encyclopedias give you fast, accurate background information. If you’re reading a novel and come across a term you don’t know – for example, a novel set in World War II mentions the Battle of the Bulge – go to an encyclopedia to look it up. If you’re writing a paper on the Battle of the Bulge, Britannica or Wikipedia is not what you should be using. Read the article to get your bearings, but then do your homework.

Rock and roll will never die. Wikipedia is not going away, so if you tell your students not to use it, you’re being unhelpful to them.

TR: How will you change Wikipedia to ensure higher-quality articles?

JW: Very soon in the German Wikipedia, there’s going to be an experiment with stable versions. Trusted contributors will be able to identify work as being accurate and peer-reviewed, then set aside those articles so they can’t be edited. How it’s going to be reviewed and the level of quality – the community needs to figure that out. Other versions will still be available for editing.

The reason we’re doing this is that, in some cases, particularly with articles that are frequently vandalized, there is a good version of the article that people keep messing up. By doing this, we can extract out our own peaks.

TR: Wikipedia erects a technological barrier between its information and its users. There are many people with expertise who don’t contribute because they don’t use computers, can’t afford Internet access, or don’t understand how to use your software; and there are also end-users who can’t access Wikipedia for the same reasons. If your goal is a high-quality encyclopedia that’s available to everyone, how will you overcome this problem?

JW: You’re right. Our mission has always been to provide a freely licensed encyclopedia to every single person in their own language. If you have a broadband connection and you speak English or a whole host of European languages, we’re doing a good job. If you don’t have a computer or don’t speak one of those languages, we haven’t achieved anything.

There is a technical barrier – plenty of people find the software intimidating. You click edit, and while most times you just see sentences, sometimes you see formatting codes.

Even outside of that, there’s a techno-social barrier. For some people, the act of participating is a barrier. I’ve met people who say they think everyone on Wikipedia sounds smart. They want to add something, but feel they don’t know enough. In some cases, they don’t know enough, but in some cases, they really should contribute.

TR: How will Wikipedia overcome its biases: toward science over the humanities, the present over the past, Western over global issues?

JW: We have a variety of systemic biases, not in the sense of one-sided articles, but in that we write about what interests us. We have a fantastic article about the USB standard, but not much about the Congo Wars – because we’re Internet geeks. That problem has gotten better as we’ve grown and become better known. We’ve come out of the core free software movement and now have diverse contributors. To improve, we’ll make the software easier to use. We also have projects within the community that identify these systemic biases and look for people to help. There’s no magic answer. We have to find the right people to help us.

TR: Why do you think that some people don’t trust Wikipedia, and what will you do to convince them?

JW: There are two kinds of people who don’t trust Wikipedia: the reasonable and the unreasonable. Time makes more converts than reason, so the unreasonable will come around eventually. The reasonable people use Wikipedia and find it useful and valuable, but are cautious about what they’re looking at. If anything seems strange or has a neutrality notice [a tag on an article indicating it is written from a non-neutral point of view], they’re very cautious. That’s perfectly reasonable. Anyone who says they don’t use Wikipedia because it’s written on the Internet is making a mistake.
