Wikipedia and the Meaning of Truth
Why the online encyclopedia’s epistemology should worry those who care about traditional notions of accuracy.
With little notice from the outside world, the community-written encyclopedia Wikipedia has redefined the commonly accepted use of the word “truth.”
Why should we care? Because Wikipedia’s articles are the first- or second-ranked results for most Internet searches. Type “iron” into Google, and Wikipedia’s article on the element is the top-ranked result; likewise, its article on the Iron Cross is first when the search words are “iron cross.” Google’s search algorithms rank a story in part by how many times it has been linked to; people are linking to Wikipedia articles a lot.
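The link-counting idea mentioned above is the heart of Google's original PageRank algorithm. As a rough illustration only (the toy graph, function name, and parameters here are invented for this sketch, not Google's actual implementation), a page that attracts many inbound links accumulates a higher score:

```python
# Minimal PageRank sketch over a toy link graph (illustrative only).
# Pages that accumulate many inbound links earn higher scores, which
# is one reason heavily linked Wikipedia articles rank so well.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: everything else links to "wikipedia",
# so it ends up with the highest score.
web = {
    "blog": ["wikipedia"],
    "news": ["wikipedia", "blog"],
    "wikipedia": ["news"],
}
scores = pagerank(web)
```

In this toy graph, "wikipedia" receives links from both other pages and therefore comes out on top, mirroring the effect described above: the more the web links to an article, the higher it climbs in search results.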
This means that the content of these articles really matters. Wikipedia’s standards of inclusion–what’s in and what’s not–affect the work of journalists, who routinely read Wikipedia articles and then repeat the wikiclaims as “background” without bothering to cite them. These standards affect students, whose research on many topics starts (and often ends) with Wikipedia. And since I used Wikipedia to research large parts of this article, these standards are affecting you, dear reader, at this very moment.
Many people, especially academic experts, have argued that Wikipedia’s articles can’t be trusted, because they are written and edited by volunteers who have never been vetted. Nevertheless, studies have found that the articles are remarkably accurate. The reason is that Wikipedia’s community of more than seven million registered users has organically evolved a set of policies and procedures for removing untruths. This also explains Wikipedia’s explosive growth: if the stuff in Wikipedia didn’t seem “true enough” to most readers, they wouldn’t keep coming back to the website.
Wikipedia’s Reference Policy
Wikipedia’s “No Original Research” Policy
Wikipedia’s “Neutral Point of View” Policy
Wikipedia’s Policy on Reliability of Sources
Wikipedia’s Citation Policy
These policies have become the social contract for Wikipedia’s army of apparently insomniac volunteers. Thanks to them, incorrect information generally disappears quite quickly.
So how do the Wikipedians decide what’s true and what’s not? On what is their epistemology based?
Unlike the laws of mathematics or science, wikitruth isn’t based on principles such as consistency or observability. It’s not even based on common sense or firsthand experience. Wikipedia has evolved a radically different set of epistemological standards–standards that aren’t especially surprising given that the site is rooted in a Web-based community, but that should concern those of us who are interested in traditional notions of truth and accuracy. On Wikipedia, objective truth isn’t all that important, actually. What makes a fact or statement fit for inclusion is that it appeared in some other publication–ideally, one that is in English and is available free online. “The threshold for inclusion in Wikipedia is verifiability, not truth,” states Wikipedia’s official policy on the subject.
Verifiability is one of Wikipedia’s three core content policies; it was codified back in August 2003. The two others are “no original research” (December 2003) and “neutral point of view,” which the Wikipedia project inherited from Nupedia, an earlier volunteer-written Web-based free encyclopedia that existed from March 2000 to September 2003 (Wikipedia’s own NPOV policy was codified in December 2001). These policies have made Wikipedia a kind of academic agora where people on both sides of politically charged subjects can rationally discuss their positions, find common ground, and unemotionally document their differences. Wikipedia is successful because these policies have worked.
Unlike Wikipedia’s articles, Nupedia’s were written and vetted by experts. But few experts were motivated to contribute. Well, some wanted to write about their own research, but Larry Sanger, Nupedia’s editor in chief, immediately put an end to that practice.
“I said, ‘If it hasn’t been vetted by the relevant experts, then basically we are setting ourselves up as a frontline source of new, original information, and we aren’t set up to do that,’” Sanger (who is himself, ironically or not, a former philosophy instructor and by training an epistemologist) recalls telling his fellow Nupedians.
With experts barred from writing about their own work and having no incentive to write about anything else, Nupedia struggled. Then Sanger and Jimmy Wales, Nupedia’s founder, decided to try a different policy on a new site, which they launched on January 15, 2001. They adopted the newly invented “wiki” technology, allowing anybody to contribute to any article–or create a new one–on any topic, simply by clicking “Edit this page.”
Soon the promoters of oddball hypotheses and outlandish ideas were all over Wikipedia, causing the new site’s volunteers to spend a good deal of time repairing damage–not all of it the innocent work of the misguided or deluded. (A study recently published in Communications of the Association for Computing Machinery found that 11 percent of Wikipedia articles have been vandalized at least once.) But how could Wikipedia’s volunteer editors tell if something was true? The solution was to add references and footnotes to the articles, “not in order to help the reader, but in order to establish a point to the satisfaction of the [other] contributors,” says Sanger, who left Wikipedia before the verifiability policy was formally adopted. (Sanger and Wales, now the chairman emeritus of the Wikimedia Foundation, fell out about the scale of Sanger’s role in the creation of Wikipedia. Today, Sanger is the creator and editor in chief of Citizendium, an alternative to Wikipedia that is intended to address the inadequacy of its “reliability and quality.”)
Verifiability is really an appeal to authority–not the authority of truth, but the authority of other publications. Any other publication, really. These days, information that’s added to Wikipedia without an appropriate reference is likely to be slapped with a “citation needed” badge by one of Wikipedia’s self-appointed editors. Remove the badge and somebody else will put it back. Keep it up and you might find yourself face to face with another kind of authority–one of the English-language Wikipedia’s 1,500 administrators, who have the ability to place increasingly restrictive protections on contentious pages when the policies are ignored.
To be fair, Wikipedia’s verifiability policy states that “articles should rely on reliable, third-party published sources” that themselves adhere to Wikipedia’s NPOV policy. Self-published articles should generally be avoided, and non-English sources are discouraged if English articles are available, because many people who read, write, and edit En.Wikipedia (the English-language version) can read only English.
In a May 2006 essay on the technology and culture website Edge.org, futurist Jaron Lanier called Wikipedia an example of “digital Maoism”–the closest humanity has come to a functioning mob rule.
Lanier was moved to write about Wikipedia because someone kept editing his Wikipedia entry to say that he was a film director. Lanier describes himself as a “computer scientist, composer, visual artist, and author.” He is good at all those things, but he is no director. According to his essay, he made one short experimental film in the 1990s, and it was “awful.”
“I have attempted to retire from directing films in the alternative universe that is the Wikipedia a number of times, but somebody always overrules me,” Lanier wrote. “Every time my Wikipedia entry is corrected, within a day I’m turned into a film director again.”
Since Lanier’s attempted edits to his own Wikipedia entry were based on firsthand knowledge of his own career, he was in direct violation of Wikipedia’s three core policies. He has a point of view; he was writing on the basis of his own original research; and what he wrote couldn’t be verified by following a link to some kind of legitimate, authoritative, and verifiable publication.
Wikipedia’s standard for “truth” makes good technical and legal sense, given that anyone can edit its articles. There was no way for Wikipedia, as a community, to know whether the person revising the article about Jaron Lanier was really Jaron Lanier or a vandal. So it’s safer not to take people at their word, and instead to require an appeal to the authority of another publication from everybody who contributes, expert or not.
An interesting thing happens when you try to understand Wikipedia: the deeper you go, the more convoluted it becomes. Consider the verifiability policy. Wikipedia considers the “most reliable sources” to be “peer-reviewed journals and books published in university presses,” followed by “university-level textbooks,” then magazines, journals, “books published by respected publishing houses,” and finally “mainstream newspapers” (but not the opinion pages of newspapers).
Once again, this makes sense, given Wikipedia’s inability to vet the real-world identities of authors. Lanier’s complaints when his Wikipedia page claimed that he was a film director couldn’t be taken seriously by Wikipedia’s “contributors” until Lanier persuaded the editors at Edge to print his article bemoaning the claim. This Edge article by Lanier was enough to convince the Wikipedians that the Wikipedia article about Lanier was incorrect–after all, there was a clickable link! Presumably the editors at Edge did their fact checking, so the wikiworld could now be corrected.
As fate would have it, Lanier was subsequently criticized for engaging in the wikisin of editing his own wikientry. The same criticism was leveled against me when I corrected a number of obvious errors in my own Wikipedia entry.
“Criticism” is actually a mild word for the kind of wikijustice meted out to people who are foolish enough to get caught editing their own Wikipedia entries: the entries get slapped with a banner headline that says “A major contributor to this article, or its creator, may have a conflict of interest regarding its subject matter.” The banner is accompanied by a little picture showing the scales of justice tilted to the left. Wikipedia’s “Autobiography” policy explains in great detail how drawing on your own knowledge to edit the Wikipedia entry about yourself violates all three of the site’s cornerstone policies–and illustrates the point with yet another appeal to authority, a quotation from The Hitchhiker’s Guide to the Galaxy.
But there is a problem with appealing to the authority of other people’s written words: many publications don’t do any fact checking at all, and many of those that do simply call up the subject of the article and ask if the writer got the facts wrong or right. For instance, Dun & Bradstreet gets the information for its small-business information reports in part by asking those very same small businesses to fill out questionnaires about themselves.
“No Original Research”
What all this means is hard to say. I am infrequently troubled by Wikipedia’s unreliability. (The quality of the writing is a different subject.) As a computer scientist, I find myself using Wikipedia on a daily basis. Its discussions of algorithms, architectures, microprocessors, and other technical subjects are generally excellent. When they aren’t excellent and I know better, I just fix them. And when they’re wrong and I don’t know better–well, I don’t know any better, do I?
I’ve also spent quite a bit of time reviewing Wikipedia’s articles about such things as the “Singularity Scalpel,” the “Treaty of Algeron,” and “Number Six.” Search for these terms and you’ll be directed to Wikipedia articles with the titles “List of Torchwood items” and “List of treaties in Star Trek,” and to one about a Cylon robot played by Canadian actress Tricia Helfer. These articles all hang their wikiexistence upon scholarly references to original episodes of Doctor Who, Torchwood, Star Trek, and Battlestar Galactica–popular television shows that the Wikipedia contributors dignify with the word “canon.”
I enjoy using these articles as sticks to poke at Wikipedia, but they represent a tiny percentage of Wikipedia’s overall content. On the other hand, they’ve been an important part of Wikipedia culture from the beginning. Sanger says that early on, Wikipedia made a commitment to having a wide variety of articles: “There’s plenty of disk space, and as long as there are people out there who are able to write a decent article about a subject, why not let them? … I thought it was kind of funny and cool that people were writing articles about every character in The Lord of the Rings. I didn’t regard it as a problem the way some people do now.”
What’s wrong with the articles about fantastical worlds is that they are at odds with Wikipedia’s “no original research” rule, since almost all of them draw their “references” from the fictions themselves and not from the allegedly more reliable secondary sources. I haven’t nominated these articles for speedy deletion because Wikipedia makes an exception for fiction–and because, truth be told, I enjoy reading them. And these days, most such entries are labeled as referring to fictional universes.
So what is Truth? According to Wikipedia’s entry on the subject, “the term has no single definition about which the majority of professional philosophers and scholars agree.” But in practice, Wikipedia’s standard for inclusion has become its de facto standard for truth, and since Wikipedia is the most widely read online reference on the planet, it’s the standard of truth that most people are implicitly using when they type a search term into Google or Yahoo. On Wikipedia, truth is received truth: the consensus view of a subject.
That standard is simple: something is true if it was published in a newspaper article, a magazine or journal, or a book published by a university press–or if it appeared on Doctor Who.
Simson L. Garfinkel is a contributing editor to Technology Review and a professor of computer science at the Naval Postgraduate School in Monterey, CA.