Startup Uses Computer Vision to Make Augmented Reality in Cities More Precise
Making virtual images look good against a real-world backdrop is not easy, but this might be a good fix.
If you’ve ever played around with an augmented-reality app like Pokémon Go on a smartphone, you know that even though it can be fun, the virtual images you see on your phone’s display often don’t look quite right against the real world in the background.
Part of the problem is that AR apps tend to depend on a combination of your phone’s GPS and compass to figure out where you are and, thus, where to show these images on your screen, yet GPS doesn’t always work that well if you’re wandering around a bustling city. This can result in virtual objects that appear jittery and out of place, rather than nicely juxtaposed with your surroundings.
An augmented reality startup called Blippar is working on a different method that it thinks can lead to much better-looking AR apps: employing computer vision to help figure out where you are and what direction you’re facing, relative to the busy urban space around you, in a way that it says is often more accurate in cities than GPS.
Omar Tayeb, a Blippar cofounder and its chief technology officer, says the startup is doing this by licensing a trove of image data for major cities—essentially, another company’s version of Google Street View (it won’t say which company it’s working with). Blippar indexes the pictures and then matches them against what a smartphone user sees through the phone’s camera, in order to find the closest match to the user’s location (the phone may also use GPS or cell-tower triangulation to determine where you are; that part isn’t up to Blippar, he says). So far, he says, the company has tested the approach in San Francisco, London, and Mountain View, California.
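The core idea—index reference images, then return the location of whichever one best matches the camera frame—can be sketched as a nearest-neighbor lookup. This is a toy illustration, not Blippar’s actual system: the feature vectors, locations, and matching metric below are all hypothetical stand-ins for whatever visual descriptors a real pipeline would extract.

```python
import math

def feature_distance(a, b):
    """Euclidean distance between two feature vectors (a toy similarity metric)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def localize(query, index):
    """Return the (lat, lon) of the indexed reference image closest to the query."""
    best = min(index, key=lambda entry: feature_distance(query, entry["features"]))
    return best["lat"], best["lon"]

# Hypothetical indexed street imagery: a descriptor plus the capture location.
index = [
    {"features": [0.9, 0.1, 0.3], "lat": 37.7749, "lon": -122.4194},  # San Francisco
    {"features": [0.2, 0.8, 0.5], "lat": 51.5074, "lon": -0.1278},    # London
    {"features": [0.4, 0.4, 0.9], "lat": 37.3861, "lon": -122.0839},  # Mountain View
]

# A descriptor extracted from the phone's current camera frame (made up here):
query = [0.85, 0.15, 0.25]
print(localize(query, index))  # nearest match is the San Francisco entry
```

A production system would replace the brute-force scan with an approximate nearest-neighbor index and learned image descriptors, but the retrieval structure is the same.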
With the city image data it’s using, Tayeb says, Blippar can get pictures of buildings from different angles, which helps determine how far away you are from a building and from what angle you’re viewing it. That, in turn, pins down more precisely where a virtual sign or other image should go.
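One way to see why multiple views constrain your position: if the same building corner is sighted from two reference points whose locations are known, the two bearing lines intersect at a unique spot. The snippet below is a minimal 2-D triangulation sketch with made-up coordinates; it isn’t Blippar’s method, just the underlying geometry.

```python
import math

def intersect_bearings(p1, theta1, p2, theta2):
    """Intersect two rays p_i + t_i * (cos theta_i, sin theta_i); angles in radians."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t1*d1 = p2 + t2*d2 as a 2x2 linear system for t1 (Cramer's rule).
    denom = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    if abs(denom) < 1e-12:
        raise ValueError("bearings are parallel; no unique intersection")
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - ry * (-d2[0])) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Two hypothetical viewpoints both sighting the same building corner at (3, 4):
# from (0, 0) the bearing is atan2(4, 3); from (6, 0) it is atan2(4, -3).
corner = intersect_bearings((0, 0), math.atan2(4, 3), (6, 0), math.atan2(4, -3))
print(corner)  # (3.0, 4.0), up to floating-point error
```

In practice the same geometry works in reverse: known landmark positions plus measured bearings from the camera constrain where the viewer must be standing.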
You can get a sense of how this could look in a video shot on an iPhone running a prototype app that Blippar uses internally.
The graphics in the video look rough: cyclists ride through bands of color overlaid on the street, and the sandwich board floats weirdly above the ground.
But the images do appear swiftly, and their locations seem to make sense. Tayeb says that Blippar’s method of location estimation is accurate within eight meters on average, and in most cases under three meters. GPS on a phone is typically accurate within five meters if you’re in an open area; it gets worse when you’re in a place with lots of buildings, trees, and so on.
Tayeb says Blippar plans to release Android and iPhone apps publicly in the next three months that will demonstrate how its positioning technique could work with content like real estate listings or restaurant reviews overlaid on real-world locations; eventually it hopes that other companies will license the technology for their own apps, too.