MIT Technology Review


Like most search engines, Google stores huge quantities of data about logged-in users’ past searches; it uses the data to tweak a set of algorithms that deliver “more relevant” search results. But a recent study questions how useful these personalized results really are.

“At the end of the day, the tradeoff is not good,” says Martin Feuz, a researcher at the Centre for Cultural Studies at the University of London, who was involved with the work. “We’re giving up too much [personal] information and not getting high-enough-quality results.”

Together with Matthew Fuller, another researcher at the Centre for Cultural Studies, and Felix Stalder, a lecturer in digital culture and network theories at the Zurich University of the Arts, Feuz created dummy Google accounts for three famous philosophers—Immanuel Kant, Friedrich Nietzsche, and Michel Foucault. They built up a fake Web history for each profile by searching Google using terms collected from each philosopher’s books. Feuz admits that these profiles aren’t likely to reflect the average user’s search terms, but he argues that the project still provides insight into how Google personalizes results.

The researchers used each profile to perform a set of test searches. They used three sets of terms—one associated with interests shared by all three philosophers, another created from popular tags on the social-bookmarking service Delicious, and a third made of phrases gleaned from several books. They then compared the results for all three profiles with the results produced through anonymous searching (performed without being logged in to a Google account).

The researchers found that personalized results appeared about half the time. Those results were significantly different from what the anonymous user saw—in one case, more than six of the top 10 results looked different. However, in many cases they found that the changes did not reveal any new content: about 37 percent of the personalized results simply involved moving links from the second page of results to the first. And only about 13 percent of the personalized results came from beyond Google’s first 1,000 links.
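The paper's published figures do not include the researchers' analysis code, but the comparison they describe — checking how many top-ranked personalized results differ from the anonymous baseline, and how many of those are merely links promoted from lower pages — can be sketched as follows. The function name `compare_results` and the toy URLs are hypothetical, not taken from the study:

```python
def compare_results(personalized, anonymous, top_n=10):
    """Compare the top-N personalized results against the anonymous
    baseline. Counts results that changed, results merely promoted
    from lower anonymous pages, and genuinely novel results."""
    anon_top = anonymous[:top_n]
    anon_rest = anonymous[top_n:]
    # Links in the personalized top N that the anonymous user did not see there
    changed = [url for url in personalized[:top_n] if url not in anon_top]
    # Of those, links that simply moved up from deeper anonymous pages
    promoted = [url for url in changed if url in anon_rest]
    # Links that never appeared anywhere in the anonymous result list
    novel = [url for url in changed if url not in anonymous]
    return {
        "changed": len(changed),
        "promoted_from_lower_pages": len(promoted),
        "novel": len(novel),
    }

# Toy example with hypothetical result identifiers:
personalized = ["a", "b", "x", "c", "y"]
anonymous = ["a", "b", "c", "d", "e", "x"]
stats = compare_results(personalized, anonymous, top_n=5)
# → {'changed': 2, 'promoted_from_lower_pages': 1, 'novel': 1}
```

Applied across many queries, tallies like these would yield aggregate figures of the kind the study reports, such as the share of personalized changes that were reshuffles rather than new content.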

Finally, the researchers found that Google was giving personalized results even in cases where there was no clear relationship between the search query and the user’s Web history. They suspect this means that Google uses Web history to assign users to demographic categories and adjust results accordingly. Feuz also says he is concerned that Google is altering the information that users see without making it clear to them that anything is happening.

Ethan Zuckerman, a researcher at the Berkman Center for Internet and Society at Harvard University, says the work provides useful empirical insight into Google’s personalization methods. He notes that the company has to keep its algorithms obscure because an entire industry is devoted to gaming the system in order to make money from search-related advertising.

Zuckerman is also concerned that Google doesn’t make clear how the average user’s access to information is modified. He notes that the algorithms might have been adapting to the researchers even as they tried to pin down Google’s behavior. “With personalization, we are studying something that’s deeply unstable,” he says. 

“The big challenge for Google is that they have so much baggage around their existing algorithm,” says David Schairer, CTO and cofounder of TrapIt, an artificial-intelligence startup that aims to help people find relevant information online. Whether using social-graph personalization or traditional search, “popular or higher-rated content tends to be self-perpetuating,” Schairer says. This makes it hard for more obscure but high-quality content to be seen.

Personalization is part of Google’s broader effort to make its search engine more socially aware. Last week, the company introduced “+1,” a service that allows users to recommend links and content to people they know. Feuz says +1 could also increase the amount of unique content available in top results. “A +1 signal from a person within a user’s social network might give them more confidence in raising a document from far below search-result position 100 to the first 10 or so,” he says.

Feuz says that he would like to see Google indicate which results are personalized and give users the ability to toggle between personalized results and standard results, so they can see how the algorithms affect what information is available to them.


Credits: Martin Feuz, Matthew Fuller, Felix Stalder

Tagged: Web, Google, search engine, algorithms, personalization

