A Path to Better Smartphone Cameras
A Canadian startup turns to software to improve the quality and reduce the size of smartphone cameras.
Smartphones are becoming the only or primary camera for many people.
Fitting the hardware for a high-quality camera into a slender smartphone is tricky. The smoother a camera lens, the less distortion it will produce. But if a lens is too small, the effect of any distortion is magnified. As a result, you sometimes see a bulge sticking out of handsets to accommodate a smooth yet sizable lens.
But software may offer a way around this. A Canadian startup called Algolux says that by computationally accounting for imperfections in lenses (or photographers), it can get higher-quality images out of today’s cell phones and eventually make phone cameras thinner and cheaper.
The Montreal-based company, which recently completed a $2.6 million funding round, is testing its technology on a variety of smartphones. Allan Benchetrit, Algolux’s CEO, says he expects some phone makers to add the company’s software to handsets next year.
A series of example photos on Algolux’s website shows why phone makers would be interested. There are marked differences between “before” and “after” shots. Smartphone photos corrected for aberrations in a camera’s hardware show sharper spikes on a cactus in one photo, and sharper letters on a building’s sprinkler system hookup in another. Algolux does this by identifying the specific defects in any given camera through a calibration process and then inverting them with its software.
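The article doesn’t describe Algolux’s algorithm, but the classic textbook way to invert a blur that has been measured through calibration is Wiener deconvolution: divide out the blur in the frequency domain, with a regularization term to keep noise from exploding where the blur kernel is weak. The sketch below is that standard technique, not Algolux’s actual method, and the `noise_power` parameter is an illustrative placeholder.

```python
import numpy as np

def wiener_deconvolve(image, psf, noise_power=0.01):
    """Invert a known blur kernel (PSF) via Wiener deconvolution.

    A calibration step would measure the camera's blur kernel (psf);
    this routine then approximately undoes that blur in the Fourier
    domain. This is a generic sketch, not Algolux's proprietary method.
    """
    # Pad the PSF to the image size, then shift its center to (0, 0)
    # so that multiplication in the frequency domain matches a
    # centered circular convolution.
    psf_padded = np.zeros_like(image, dtype=float)
    kh, kw = psf.shape
    psf_padded[:kh, :kw] = psf
    psf_padded = np.roll(psf_padded, (-(kh // 2), -(kw // 2)), axis=(0, 1))

    H = np.fft.fft2(psf_padded)   # frequency response of the blur
    G = np.fft.fft2(image)        # spectrum of the blurred image
    # Wiener filter: conj(H) / (|H|^2 + noise) avoids dividing by
    # near-zero values of H, which would amplify sensor noise.
    F = np.conj(H) * G / (np.abs(H) ** 2 + noise_power)
    return np.real(np.fft.ifft2(F))
```

In practice the regularization term would be tuned to the sensor’s measured noise level; larger values suppress noise at the cost of leaving some residual blur.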
The company also has a method for correcting for motion blur, which often occurs when you take photos in low light. To do this, Algolux uses a front-facing camera to grab high-speed video while photos are being shot with the rear camera. Data from the front camera is used for motion tracking and combined with readings from sensors on the phone, such as its accelerometer and gyroscope, to get an overall measurement of how the user moved the camera to create the blur. This information can then be used to determine how the deblurring software should go to work on images taken with the rear camera.
It could be a while before this second tactic is included in any phones, though. Using the front and rear cameras simultaneously to shoot photos and videos isn’t common with existing smartphones. It’s not even possible on Apple devices; Google says it’s up to Android smartphone makers to decide if they want to enable simultaneous camera use on handsets.
And generally, resolving either type of blur with software could eat up a lot of processing power and battery life, says Bruce Hemingway, a senior lecturer in computer science and engineering at the University of Washington who teaches a course on the science and art of digital photography. Already, smartphone camera apps doing heavy computation sometimes pause between shots, he says, which people hate.
Still, he says, the company’s sample images look good.
“I think it is feasible,” he says. “We’re on the edge of where this is really effective in, say, cell phones.”