This idea resonated with other Web researchers, and in the late 1990s it began to bear fruit. Its first major result was the Resource Description Framework (RDF), a new system for locating and describing information whose specifications were published as a complete W3C recommendation in 1999. But over time, proponents of the idea became more ambitious and began looking to the artificial-intelligence community for ways to help computers independently understand and navigate through this web of metadata.
Since 1998, researchers at W3C, led by Berners-Lee, had been discussing the idea of a “semantic” Web, which not only would provide a way to classify individual bits of online data such as pictures, text, or database entries but would define relationships between classification categories as well. Dictionaries and thesauruses called “ontologies” would translate between different ways of describing the same types of data, such as “post code” and “zip code.” All this would help computers start to interpret Web content more efficiently.
In this vision, the Web would take on aspects of a database, or a web of databases. Databases are good at providing simple answers to queries because their software understands the context of each entry. “One Main Street” is understood as an address, not just random text. Defining the context of online data just as clearly (labeling a cat as an animal, and a veterinarian as an animal doctor, for example) could result in a Web that computers could browse and understand much as humans do, researchers hoped.
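The idea sketched above, that facts become machine-readable once each piece of data carries an explicit label, and that an ontology reconciles different labels for the same thing, can be illustrated with a toy example. This is a minimal sketch, not actual W3C RDF syntax or software; all names and data here are invented for illustration:

```python
# Facts stored as (subject, predicate, object) triples, echoing the
# labeled-context idea: a cat is an animal, a veterinarian treats animals.
# All entities and property names are hypothetical.
triples = {
    ("Felix", "is_a", "cat"),
    ("cat", "is_a", "animal"),
    ("Dr. Lee", "is_a", "veterinarian"),
    ("Main St Clinic", "post_code", "02139"),
}

# Ontology-style aliasing: "post_code" and "zip_code" describe the same
# property, so queries using either name should find the same data.
aliases = {"zip_code": "postal_code", "post_code": "postal_code"}

def canonical(predicate):
    """Translate a property name to its canonical form, if an alias exists."""
    return aliases.get(predicate, predicate)

def ask(subject, predicate):
    """Return every object related to `subject` by `predicate`, after aliasing."""
    wanted = canonical(predicate)
    return {o for (s, p, o) in triples if s == subject and canonical(p) == wanted}

print(ask("Felix", "is_a"))                 # the data labels Felix a cat
print(ask("Main St Clinic", "zip_code"))    # "zip_code" resolves via "post_code"
```

Because the query for `zip_code` is translated to the same canonical property as `post_code`, software can answer questions phrased in either vocabulary, which is the role the article assigns to ontologies.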
To go back to the Web-as-highway metaphor, this might be analogous to creating detailed road signs that cars themselves could understand and upon which they could act. The signs might point out routes, describe road and traffic conditions, and offer detailed information about destinations. A car able to understand the signs could navigate efficiently to its destination, with minimal intervention by the driver.
In articles and talks, Berners-Lee and others began describing a future in which software agents would similarly skip across this “web of data,” understand Web pages’ metadata content, and complete tasks that take humans hours today. Say you’d had some lingering back pain: a program might determine a specialist’s availability, check an insurance site’s database for in-plan status, consult your calendar, and schedule an appointment. Another program might look up restaurant reviews, check a map database, cross-reference open table times with your calendar, and make a dinner reservation.
At the beginning of 2001, the effort to realize this vision became official. The W3C tapped Miller to head up a new Semantic Web initiative, unveiled at a conference early that year in Hong Kong. Miller couldn’t be there in person; his wife was in labor with their first child, back in Dublin. He saw it as a double birthday.