Like most search engines, Google stores huge quantities of data about logged-in users’ past searches; it uses the data to tweak a set of algorithms that deliver “more relevant” search results. But a recent study questions how useful these personalized results really are.

Big thinkers: Researchers tested Google’s personalization algorithms by building up search histories based on books by well-known philosophers.

“At the end of the day, the tradeoff is not good,” says Martin Feuz, a researcher at the Centre for Cultural Studies at the University of London, who was involved with the work. “We’re giving up too much [personal] information and not getting high-enough-quality results.”

Together with Matthew Fuller, another researcher at the Centre for Cultural Studies, and Felix Stalder, a lecturer in digital culture and network theories at the Zurich University of the Arts, Feuz created dummy Google accounts for three famous philosophers—Immanuel Kant, Friedrich Nietzsche, and Michel Foucault. They built up a fake Web history for each profile by searching Google using terms collected from each philosopher’s books. Feuz admits that these profiles aren’t likely to reflect the average user’s search terms, but he argues that the project still provides insight into how Google personalizes results.

The researchers used each profile to perform a set of test searches. They used three sets of terms—one associated with interests shared by all three philosophers, another created from popular tags on the social-bookmarking service Delicious, and a third made of phrases gleaned from several books. They then compared the results for all three profiles with the results produced through anonymous searching (performed without being logged in to a Google account).

The researchers found that personalized results appeared about half the time. Those results were significantly different from what the anonymous user saw—in one case, more than six of the top 10 results looked different. However, in many cases they found that the changes did not reveal any new content: about 37 percent of the personalized results simply involved moving links from the second page of results to the first. And only about 13 percent of the personalized results came from beyond Google’s first 1,000 links.
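To make that comparison concrete, here is a minimal sketch, in Python, of how such a shift could be quantified for a single query. The function name, page size, and example URLs are illustrative assumptions, not the researchers' actual method or data.

def personalization_shift(anonymous, personalized, page_size=10):
    """Compare an anonymous and a personalized result list (URLs ranked best-first).
    Returns how many first-page results differ and how many of those were merely
    promoted from the anonymous second page."""
    anon_page1 = set(anonymous[:page_size])
    anon_page2 = set(anonymous[page_size:2 * page_size])
    pers_page1 = personalized[:page_size]

    changed = [url for url in pers_page1 if url not in anon_page1]
    promoted_from_page2 = [url for url in changed if url in anon_page2]
    return len(changed), len(promoted_from_page2)

# Hypothetical example: two results swapped into the first page, one of them
# simply moved up from the anonymous second page.
anon = [f"https://example.org/{i}" for i in range(20)]
pers = anon[:8] + ["https://example.org/11", "https://example.org/new"] + anon[10:20]
print(personalization_shift(anon, pers))  # (2, 1)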

Finally, the researchers found that Google was giving personalized results even in cases where there was no clear relationship between the search query and the user’s Web history. They suspect this means that Google uses Web history to assign users to demographic categories and adjust results accordingly. Feuz also says he is concerned that Google is altering the information that users see without making it clear to them that anything is happening.

Ethan Zuckerman, a researcher at the Berkman Center for Internet and Society at Harvard University, says the work provides useful empirical insight into Google’s personalization methods. He notes that the company has to keep its algorithms obscure because an entire industry is devoted to gaming the system in order to make money from search-related advertising.

Zuckerman is also concerned that Google doesn’t make clear how the average user’s access to information is modified. He notes that the algorithms might have been adapting to the researchers even as they tried to pin down Google’s behavior. “With personalization, we are studying something that’s deeply unstable,” he says. 

“The big challenge for Google is that they have so much baggage around their existing algorithm,” says David Schairer, CTO and cofounder of TrapIt, an artificial-intelligence startup that aims to help people find relevant information online. Whether using social-graph personalization or traditional search, “popular or higher-rated content tends to be self-perpetuating,” Schairer says. This makes it hard for more obscure but high-quality content to be seen.

Personalization is part of Google’s broader effort to build social signals into its search engine. Last week, the company introduced “+1,” a service that allows users to recommend links and content to people they know. Feuz says +1 could also increase the amount of unique content available in top results. “A +1 signal from a person within a user’s social network might give them more confidence in raising a document from far below search-result position 100 to the first 10 or so,” he says.

Feuz says that he would like to see Google indicate which results are personalized and give users the ability to toggle between personalized results and standard results, so they can see how the algorithms affect what information is available to them.
