A research project called Bing Now, demonstrated at Microsoft’s headquarters last week, could give Web searchers a way to gauge the current vibe of a bar or restaurant before they book a table.
Smartphones can already deliver directions to a restaurant, endless reviews, and other static information. But short of making a phone call, or actually going there, there is no way to know whether a place is busy and playing loud rock music or empty and playing quieter tunes. Microsoft researchers think that smartphone owners who are already on the scene could collect this kind of up-to-the-minute information.
“Every time a user checks in to a business, he actually explicitly tells us where he is going, and we actually know the phone is in the user’s hands,” says Dimitrios Lymberopoulos, who is in the sensing and energy group of Microsoft Research. When a person checks in, as a Foursquare user would, the phone could collect 6- to 10-second audio samples, process them on the device, and send off the extracted data. From this information, software models developed by Microsoft’s researchers could tease out the size of the crowd, the level of chatter, and the music volume and classify them as “low,” “normal,” or “high.” The app could even tell a searcher what song is playing.
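The article doesn't describe Microsoft's actual feature extraction or models, but the basic shape of the pipeline it describes can be sketched in a few lines: sample a short clip on the phone, reduce it locally to a coarse loudness feature, and send only the extracted label rather than raw audio. Every function name and threshold below is an illustrative assumption, not Microsoft's implementation.

```python
# Hypothetical sketch of an on-device check-in pipeline like the one
# described above. Thresholds and function names are made up for
# illustration; only the classified label would leave the phone.
import math


def rms_energy(samples):
    """Root-mean-square energy of a clip of PCM samples in [-1, 1]."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))


def classify_level(energy, low_cut=0.05, high_cut=0.3):
    """Map a loudness feature to the article's three buckets."""
    if energy < low_cut:
        return "low"
    if energy < high_cut:
        return "normal"
    return "high"


def check_in_report(samples):
    """Process a 6- to 10-second clip locally; report only the label."""
    return {"noise_level": classify_level(rms_energy(samples))}


# A quiet clip vs. a loud one (constant amplitudes stand in for audio)
print(check_in_report([0.01] * 1000))  # {'noise_level': 'low'}
print(check_in_report([0.5] * 1000))   # {'noise_level': 'high'}
```

A real system would use richer features (spectral ones, say, to separate chatter from music) and trained models rather than fixed cutoffs, but the privacy-relevant design point is the same: the raw audio is processed and discarded on the device.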
At the moment, Bing Now is only a research project (see video demo here). But as companies like Google, Yelp, and many startups all try to position themselves as the go-to way to find local businesses, it’s not hard to imagine how the idea could help Microsoft promote its Bing search engine and become a bigger presence on mobile devices.
The Bing Now app would allow for new kinds of search queries. For example, a person who wants to conduct a sensitive business lunch could search for a list of local restaurants with high chatter levels.
And if enough people check in to different businesses, Lymberopoulos imagines, this data could also be incorporated into more general local search queries. The classification algorithms—for example, one that separates chatter from background music—had an 80 percent accuracy rate at 150 businesses in the Seattle area, he says.
Of course, it’s not clear how broadly useful such a tool would be, given that people don’t normally search by these sorts of criteria today. And some users may find the idea creepy. The project is, however, one of a growing number of ways smartphone sensors are being used to compile useful information that reflects current conditions at a location.
“It’s really interesting. I think it’s building off a general interest in crowdsourcing information in real time from people out in the world,” says University of Rochester computer scientist Jeffrey Bigham, who works with similar methods. Bigham notes, though, that there could be other ways to collect the same data, such as simply asking users to share their impressions of a restaurant’s atmosphere when they check in.
Today, Google Maps relies partly on mobile device location data to estimate the traffic conditions along a traveler’s route and predict better arrival times. And a company called PressureNet is working on creating better hyperlocal weather readings by compiling data from pressure sensors now starting to appear on Android devices (see “App Feeds Scientists Atmospheric Data from Thousands of Smartphones”). Others, says Bigham, are trying to use Twitter to gather crowdsourced information about wait times in airport security lines. If wearable computing devices like Jawbone or Google’s upcoming Glass become common, a person wouldn’t even have to take out a device and explicitly check in.
As these experiments become more common, some researchers are even contemplating a whole new kind of search engine—a real-time one that searches data from networks of sensors, including those in mobile devices.