The new feature will be available when a user clicks the “Latest results” tab on Google searches. It will be available immediately in English-speaking countries and will soon be expanded to other languages, the company says. Searchers will see updates from popular social sites such as Twitter and Friendfeed, and headlines from news sites. Visiting Google Trends and clicking on a “hot topic” will reveal a search results page showing the most popular real-time information.
Other search engines are working to make their results just as fresh. Bing includes some recent results in its search returns, and the newcomer Cuil launched streaming results last month. “It is a good thing to see Google innovate on their search page thanks to competition brought on by other search engines like Bing and Cuil,” said Seval Oz Ozveren, VP for business development at Cuil.
The visual search tool, released in Google Labs, lets users take a photo of a landmark or a store sign, for example, and then searches billions of images for matches, and for Web pages providing relevant information. However, this feature will not include face-recognition software until Google devises a system to protect privacy. “We have decided to delay that until we have more safeguards in place,” says Vic Gundotra, Google’s vice president for engineering.
Dan Weld, a computer scientist and search researcher at the University of Washington, tested the visual search technology and pronounced it “pretty darn cool.” He says that it recognized a can of Diet Dr Pepper and found relevant search returns. And, after initially drawing a blank on a bottle of Lipton Iced Tea, it recognized the bottle in a closer shot and delivered good search results.
Weld suggests that the technology works by performing optical character recognition on the words, rather than by recognizing the label images themselves, since at one point it caught the letters “API” from a label and gave him search results for “application programming interface”. The technology also recognized the Seattle Space Needle and gave him tourist websites. “Not a formal evaluation, but it’s pretty neat,” he says. “And it seems like it has the potential to be a huge opportunity for them if it takes off.”
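Weld's guess, that the system reads the text on a label and then runs an ordinary text search on it, can be sketched in a few lines. Everything here is illustrative, not Google's actual pipeline: the OCR step is stubbed out (a real system would use an OCR engine), and the tiny page index and function names are invented for the example.

```python
# Sketch of the pipeline Weld describes: extract text from the photo,
# then match the extracted words against a text index of web pages.
# The OCR step is a stub; real systems would use an OCR engine here.

def ocr_label(photo_text_regions):
    """Stand-in for OCR: pretend the words were already extracted."""
    return " ".join(photo_text_regions)

def text_search(query, index):
    """Naive text search over a tiny stand-in index of page titles."""
    terms = query.lower().split()
    return [title for title in index
            if any(term in title.lower() for term in terms)]

# A hypothetical, minimal index of page titles.
index = [
    "Application programming interface (API) - Wikipedia",
    "Diet Dr Pepper | Official Site",
    "Space Needle - Seattle tourism",
]

# A label whose only legible text is "API" matches the API page,
# mirroring what Weld observed.
query = ocr_label(["API"])
print(text_search(query, index))
```

The point of the sketch is that no image understanding is needed once legible text is present, which would explain why a blurry Lipton bottle failed until a closer shot made the words readable.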
With the convergence of billions of mobile networked devices, powerful cloud computing resources, and ubiquitous sensors like cameras and GPS chips, “it could be that we are on the cusp of a new computing era,” Gundotra added. “Take the camera and connect it to the cloud, it becomes an eye. The microphone connected to the cloud becomes an ear. Search by sight, search by location, search by voice.”