# Web

## Wolfram Alpha and Google Face Off

Our exclusive test shows how the two Web engines compare when given the same queries.


Last week, as physicist Stephen Wolfram was demonstrating his new Web-based “computation engine”–Wolfram Alpha–to the public, Google announced a data-centric service of its own. Alpha accesses databases that are maintained by Wolfram Research, or licensed from others, and deploys formulas and algorithms to compute answers for searchers.

Using some prelaunch log-in credentials provided by the Wolfram team, I decided to run my own Wolfram Alpha versus Google test. I used a handful of search terms that could produce data-centric answers and tried variations in a few cases to see what might happen.

This was an effort to get beyond the characterizations and produce some real data. I also wanted to explore the claims made during my visit to Wolfram Research last week: that Alpha can add unique value in computing answers based on your search queries.

Here’s what I entered, and what I found.

SEARCH TERM: Microsoft Apple

WOLFRAM ALPHA: I got side-by-side tables and graphics on the stock prices and data on the two companies, plus a chart plotting the price of both stocks over time.

GOOGLE: The top hits were mostly news stories, from major and minor publications, containing both words.

VARIATION: When I changed the Google search term to just “Microsoft” or just “Apple,” I got a chart with today’s stock price up top; when I clicked that link, I received tons of information–comparable to what Alpha provides–but only on the single company.

SEARCH TERM: Sydney New York

WOLFRAM ALPHA: I got tables showing the distance between the two cities in miles, kilometers, meters, even nautical miles; a map of the world with the optimal flight path; and the fact that the trip spans 0.4 of the earth’s circumference. I learned how long it would take to make the trip: 18.1 hours flying; 13 hours for a sound wave; 74 milliseconds for a light beam in fiber; and 53 milliseconds for a light beam traveling in a vacuum. I also got comparative populations, elevation in meters, and current local times.

GOOGLE: I got a mix of things: a form for finding flights between Sydney and New York; a Google Maps-plotted list of businesses in New York City that contain the word “Sydney”; and links to the municipal government of Sidney, a small town in upstate New York.

VARIATION: When I tried “Sydney New York distance” (adding the word “distance”), Wolfram gave me only the distance information mentioned above, while Google gave me links to distance-finding websites. I opened the first one, entered “New York” and “Sydney” into its form, and wound up with much the same information Wolfram provided (but without the light-beam and sound-wave details).
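Alpha’s travel-time figures are easy to sanity-check with back-of-the-envelope arithmetic. A sketch, assuming a great-circle distance of roughly 15,990 km, the speed of sound in air at room temperature, and an effective fiber refractive index of about 1.4 (none of these inputs is shown by Alpha itself):

```python
# Back-of-the-envelope check of Alpha's Sydney-New York figures.
C_VACUUM = 299_792_458          # speed of light in vacuum, m/s (exact)
EARTH_CIRCUMFERENCE_KM = 40_075 # equatorial circumference
SOUND_M_PER_S = 343             # speed of sound in air at ~20 C (assumption)
FIBER_INDEX = 1.4               # assumed effective refractive index of fiber

distance_m = 15_990_000         # approximate great-circle distance (assumption)

fraction = distance_m / 1000 / EARTH_CIRCUMFERENCE_KM  # ~0.4 of circumference
vacuum_ms = distance_m / C_VACUUM * 1000               # ~53 ms
fiber_ms = vacuum_ms * FIBER_INDEX                     # ~75 ms
sound_hours = distance_m / SOUND_M_PER_S / 3600        # ~13 hours
```

The vacuum, sound, and circumference figures land on Alpha’s numbers; the fiber figure depends on the assumed index, which Alpha does not disclose.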

SEARCH TERM: 10 pounds kilograms

WOLFRAM ALPHA: The site informed me that it interpreted my search term as an effort to multiply “10 pounds” by “1 kilogram” and gave me this result: 4.536 kg² (kilograms squared) or 22.05 lb² (pounds squared).

GOOGLE: Google gave me links to various metric conversion sites.

VARIATIONS: Adding the word “in” changed everything. When I tweaked the search query to say “10 pounds in kilograms,” the Wolfram site gave me the correct conversion: 10 pounds equals 4.536 kilograms. It also gave me the volumes (in various units) of 10 pounds of water. In a final, somewhat cheesy touch, it also told me that 10 pounds was 1.8 times the weight of Wolfram’s book, *A New Kind of Science*. In Google’s case, this revised search term produced the helpful calculated result up top: 10 pounds = 4.5359237 kilograms.

When I put in “10 lbs kgs,” Alpha gave me the calculated result (the assumption was that I wanted multiplication), as it had with the full words. Google gave me metric conversion sites–the top one was a “Russian Brides Cyber Guide.” (It offers both brides and metric conversions.)

When I tried “10 pds kgs,” Alpha choked and didn’t understand. Google helpfully asked if I meant “pounds” and gave me metric conversion sites, but not the calculated result.
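Alpha’s two readings of the ambiguous query come down to simple unit arithmetic. A sketch of both interpretations, using the exact international definition of the pound (1 lb = 0.45359237 kg):

```python
# Two interpretations of "10 pounds kilograms", matching Alpha's behavior.
LB_TO_KG = 0.45359237            # exact definition: 1 lb = 0.45359237 kg

# Interpretation 1: a product, 10 lb x 1 kg.
# In kg^2, convert the pounds to kilograms first;
# in lb^2, convert the kilogram to pounds instead.
product_kg2 = (10 * LB_TO_KG) * 1.0    # ~4.536 kg^2
product_lb2 = 10 * (1.0 / LB_TO_KG)    # ~22.05 lb^2

# Interpretation 2: a conversion, "10 pounds in kilograms".
converted_kg = 10 * LB_TO_KG           # ~4.5359237 kg
```

The product interpretation reproduces Alpha’s 4.536 kg² and 22.05 lb²; the conversion interpretation reproduces Google’s 4.5359237 kilograms.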

SEARCH TERM: light bulb

WOLFRAM ALPHA: I was expecting some facts and figures on this ubiquitous technology but got a message saying that Wolfram Alpha “isn’t sure what to do with your input.”

GOOGLE: I got several links–starting with a Wikipedia entry–explaining what a light bulb is and providing some history.

VARIATIONS: When I tried “light bulb inventor,” I got similar results: Alpha drew a blank, but Google gave useful links. When I tried “first light bulb,” Alpha provided a table explaining that the light bulb was patented in 1878; under “people involved,” it cited Thomas Edison.

SEARCH TERM: Aspirin Tylenol

WOLFRAM ALPHA: Alpha gave me molecular diagrams for aspirin and acetaminophen and lots of scientific information comparing their molecular weights, boiling points, vapor pressure, and so forth.

GOOGLE: Usefully (to nonchemists suffering from headaches), the top link was to a Wiki-answers page telling people whether they can take aspirin and Tylenol together. Other links gave information about toxicity, danger to kidneys, and the like.

SEARCH TERM: Stanford Harvard

WOLFRAM ALPHA: I got tables comparing data from the two schools: size of student bodies–broken down by full-time, part-time, undergraduate, and graduate–plus the number of undergraduate, master’s, and doctoral degrees awarded, and similar data. Alpha listed Stanford’s tuition as $25,000, which is incorrect, and listed no tuition at all for Harvard. As with all of Alpha’s results, it gave me sources against which to check the information.

GOOGLE: Google gave me a collection of links (starting with a discussion board for students trying to make a college decision) and various news stories containing the two terms.

SEARCH TERM: Cancer New York

WOLFRAM ALPHA: I was expecting statistics on cancer rates in New York. Instead, the Wolfram site assumed I meant the constellation. It showed me where Cancer could be found in the night sky viewed from New York, told me when it would next rise and set, and included a map of the night sky.

GOOGLE: The first link was to Memorial Sloan-Kettering Cancer Center in New York. The second was to the New York State Department of Health’s cancer page. The third was to the New York State Cancer Registry. Not bad.

VARIATIONS: Adding a second state (Cancer New York Nevada) confused Wolfram–it didn’t know what I wanted. With Google, all the top results were Nevada-centric: a mix of news stories, lawyers’ websites, and medical centers relating to cancer (the disease) in Nevada. No comparisons, no data, and not as helpful as the results for plain “Cancer New York.”

SEARCH TERM: Utah Florida population

WOLFRAM ALPHA: Alpha gave me tables containing the two states’ populations from 2006, the population growth rate from 2000 to 2006 (including a chart that I could download), and the number of annual births and deaths in 2004.

GOOGLE: Even though Google just launched a new data-presentation service with access to public census and labor data, this search term did not bring me to that service. The first hit was a U.S. Census Bureau press release that itself contained links to population tables.

VARIATIONS: When I tried “Utah population,” Google did give me a view of its new service: a simple chart of Utah’s population from 1980 to the present.

When I changed the search term to “Utah Florida,” Wolfram threw the almanac at me, giving side-by-side tables on population data plus high and low elevations of the two states, the dates that the states joined the union, the area of farmland, the household income and poverty rates, and so on. Google gave me random sites that contained the two words, starting with a mapped location of a business in Lake Mary, Florida, that contains the word “Utah.”

Generally, I did not use search terms that clearly had no computable answer (and therefore would have stumped Wolfram). But I also didn’t throw any softballs in areas close to the heart of its makers: physics, chemistry, engineering, and genomics. On hard-core scientific questions, Alpha gives you tons of symbols, graphics, and other information that would be useful to a researcher but obscure to most people. But on many common questions for which there is no obvious data element, you will not get much help. In any event, if Wolfram’s plans hold, you should be able to test it out yourself in two or three weeks.