Deeper Vision

November 1, 2004

Researchers are making big strides toward low-cost systems that mimic human vision to give machines three-dimensional information about their environments. By building hardware that analyzes corresponding chunks of paired live images in parallel – as the human brain is thought to do – Tyzx, a startup in Menlo Park, CA, is making computerized depth perception fast enough that surveillance devices and robotic vehicles can incorporate it.

Creatures with two forward-facing eyes can perceive depth because their left and right eyes see from slightly different perspectives: the apparent displacement of nearby objects between the two views is greater than that of distant objects. Using this apparent difference, called parallax, the brain swiftly determines the distance to an object. While a machine equipped with a pair of cameras can also use parallax to see in three dimensions, the amount of computation required to find matching pixels had previously made stereo machine vision impractical for most situations.
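The geometry behind this is simple triangulation: depth is inversely proportional to the pixel shift (disparity) between the two views. A minimal sketch in Python, with illustrative focal-length and camera-baseline numbers that are not from the article:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Z = f * B / d: the larger the left-right shift, the closer the object."""
    return focal_px * baseline_m / disparity_px

# With an assumed 500-pixel focal length and 10 cm camera baseline,
# a 50-pixel shift means a nearby object, a 5-pixel shift a distant one:
near = depth_from_disparity(500, 0.1, 50)  # 1.0 m
far = depth_from_disparity(500, 0.1, 5)    # 10.0 m
```

Finding which pixel in the right image corresponds to each pixel in the left image is the expensive part; the formula itself is cheap.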

Tyzx computer vision experts Gaile Gordon and John Woodfill invented an algorithm to speed the process. Rather than trying to find pixels with the same color and brightness, the algorithm seeks out left-right pairs where there is a similar contrast in intensity between one pixel and its surrounding pixels. The researchers then built an integrated circuit that can search many groups of pixels simultaneously. They gave this chip a pair of “eyes,” and now “the image capture and the stereo computation all happen inside one relatively inexpensive, self-contained platform,” says Tyzx CEO Ron Buck.
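The matching idea described above can be sketched in software. This is a hedged illustration, not Tyzx's actual algorithm or chip logic: each pixel is summarized by which of its neighbors are darker than it, and left-right candidates are compared by how many of those contrast bits differ, rather than by raw color or brightness.

```python
def census(image, x, y):
    """8-bit signature: one bit per neighbor, set if the neighbor is darker
    than the center pixel. Encodes local contrast, not absolute intensity."""
    center = image[y][x]
    bits = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dx == 0 and dy == 0:
                continue
            bits = (bits << 1) | (1 if image[y + dy][x + dx] < center else 0)
    return bits

def hamming(a, b):
    """Number of contrast bits that disagree between two signatures."""
    return bin(a ^ b).count("1")

def best_disparity(left, right, x, y, max_disparity):
    """Slide along the right image's scanline; the shift whose local-contrast
    signature best matches the left pixel's is the disparity estimate."""
    target = census(left, x, y)
    best_d, best_cost = 0, float("inf")
    for d in range(min(max_disparity, x - 1) + 1):
        cost = hamming(target, census(right, x - d, y))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```

Comparing bit signatures like this is robust to brightness differences between the two cameras, and each candidate shift can be scored independently, which is what makes the search amenable to the kind of parallel hardware the article describes.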

Among the company’s early customers are federal security agencies – Buck says he can’t reveal which ones – that are using the technology to track suspicious individuals as they move against changing backgrounds such as crowds.
