Google Tweaks Search to Challenge Apple’s Siri
Upgrades to Google’s search engine will make it better at understanding conversational queries, helping its mobile search apps tread on Siri’s toes.
Making search better at understanding natural language and speech could allow major changes in how people use mobile devices.
Google announced a series of upgrades to its search engine and mobile search apps today that strengthen its ability to understand queries in the form of natural sentences like those used in conversation. The changes are particularly focused on enabling more complex spoken interactions with Google’s mobile apps, boosting the company’s challenge to Apple’s Siri personal assistant.
“We are making your conversation with Google more natural,” said Amit Singhal, who leads search technology at Google. He spoke at a press conference held in the Menlo Park garage that Google cofounders Larry Page and Sergey Brin used as their first office space in 1998.
The new features apply to all Google searches but were all demonstrated with queries spoken out loud to Google’s mobile apps. One change makes Google better at understanding broad questions about categories of concepts. For example, saying “Tell me about Impressionist artists” to the Google search app on a mobile device or tablet calls up a page that presents many ways to explore the topic. A carousel of images at the top of the page allows a person to swipe through different artists, and tapping one leads to another summary page with a carousel of works from that artist. Asking Google about a band brings up a list of its songs to hear. Movies and many other topics can be explored in the same way.
Another upgrade gives Google the ability to compare different things or concepts. For example, asking the search app to “compare coconut oil and olive oil” produces a table contrasting their nutritional qualities. Google selects the most relevant criteria to compare things. Ask for a comparison of two celestial bodies, for example, and it will consider properties such as brightness, age, weight, and orbital period.
Google’s new features rest on a system called Knowledge Graph, which the company unveiled last year. It gives the company’s software the ability to understand the meaning of, and relationships between, the concepts and things mentioned in text (see “Google’s New Brain Could Have a Big Impact”).
Tamar Yehoshua, vice president for search at Google, also demonstrated an upgraded version of Google’s search app for Apple devices. “We have made voice a much bigger feature,” she said. The changes put it into even more direct competition with Siri, which is promoted as a personal assistant people can talk to like a real person.
One new feature of the upgraded iOS app makes it possible to ask the app to remind you of something when you get to a specific location. If you tell it to “remind me to get crackers when I go to Safeway,” the app will confirm which store you mean and then notify you the next time you visit that location.
Singhal also announced that roughly one month ago, his team had completely overhauled Google’s core search ranking system to improve its ability to handle longer, more conversational queries. The upgraded system, known as Hummingbird, replaces one known as Caffeine, which had been used since 2010. About 90 percent of Google searches have been affected by the change.
“People have started asking many more complex questions of Google, and our algorithm had to go through some fundamental rethinking,” said Singhal. The changes were focused on improving Google’s ability to understand the concepts a person refers to in a query and how they are related, he said. “You have to balance all that meaning of what the query is looking for with what the Web document is saying.”