The iconic spinning laser sensors atop autonomous cars may be making their final turns. Velodyne, the world’s market-leading lidar manufacturer, has built a new device that sees further and in more detail than any lidar sensor currently on sale, in a package a fifth the size of its previous high-resolution device.
Lidar sensors, which bounce laser beams off nearby objects to create highly accurate 3-D maps of their surroundings, are an important component for most self-driving cars. Until now, the pick of the commercially available crop has been Velodyne’s HDL-64E—a coffee-can-size lump that fires 64 laser beams, one atop the other, as it spins in circles. Each beam is separated from the next by an angle of 0.4°, with a range of 120 meters.
But those specifications aren’t enough to help vehicles at high speed in unpredictable situations. A car traveling at 70 miles per hour would have just four seconds to respond to an obstacle first detected at the edge of that range. The angular resolution is also too low to make out an object that’s far away, because the laser beams will be too spread out to return a viable image. As a result, many autonomous cars use data from other sensors to help recognize obstacles, even if they have lidar sensors on board.
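The "four seconds" figure follows directly from the sensor's 120-meter range. A quick back-of-the-envelope check (the function name here is illustrative, not from Velodyne):

```python
def reaction_time_s(range_m: float, speed_mph: float) -> float:
    """Seconds until a vehicle reaches an obstacle first seen
    at the sensor's maximum range."""
    speed_ms = speed_mph * 1609.344 / 3600  # convert mph to m/s
    return range_m / speed_ms

# HDL-64E: 120 m range, car at 70 mph
print(round(reaction_time_s(120, 70), 1))  # ~3.8 seconds
```

At 70 mph (about 31.3 m/s), an object appearing at the 120-meter limit leaves roughly 3.8 seconds before impact, before any braking distance is accounted for.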
“The consensus is pretty clear,” Austin Russell, CEO of rival lidar manufacturer Luminar and one of our 35 Innovators under 35 of 2017, told MIT Technology Review earlier this year. “You need a lidar that can see out past 200 meters. You also need to be able to see ... not just that there’s an object out there, but what it is.”
Such concerns have pushed the world’s most advanced driverless-car projects to build their own lidar systems. Two of them, Waymo and Uber, are currently embroiled in a heated lawsuit centered on stolen trade secrets related to their in-house devices.
Now, Velodyne has launched a sensor called VLS-128 that it hopes will satisfy the demand for increased performance. The new device uses 128 laser beams, twice as many as its predecessor. The firm’s chief technical officer, Anand Gopalan, told MIT Technology Review that those beams are separated by angles as small as 0.1°, with a range of 300 meters, and create as many as four million data points per second as they spin through 360 degrees. The increase in resolution, says Gopalan, will provide such detail that cars won’t need other sensors for obstacle detection—though they probably will still carry other sensors in the interests of redundancy and safety.
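The resolution gain is easy to quantify: the gap between adjacent beams grows linearly with distance, so halving the angular separation halves the gap. A small sketch of that geometry (function name is mine, not from either spec sheet):

```python
import math

def beam_gap_m(separation_deg: float, distance_m: float) -> float:
    """Approximate vertical gap between adjacent laser beams
    at a given distance from the sensor."""
    return distance_m * math.tan(math.radians(separation_deg))

# At 120 m, the HDL-64E's maximum range:
print(round(beam_gap_m(0.4, 120), 2))  # HDL-64E (0.4 deg): ~0.84 m
print(round(beam_gap_m(0.1, 120), 2))  # VLS-128 (0.1 deg): ~0.21 m
# Even at its extended 300 m range, the new sensor's beams stay tighter:
print(round(beam_gap_m(0.1, 300), 2))  # ~0.52 m
```

A beam gap of 0.84 meters can straddle a pedestrian entirely; at roughly a fifth of a meter, distant objects return enough points to be classified, not just detected.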
Ingmar Posner, an associate professor of information engineering at the University of Oxford and founder of the university’s autonomous-driving spinoff Oxbotica, says the increase in performance is “awesome.” He agrees that it should allow vehicles to detect objects more reliably using lidar data, and believes it would enable driverless cars to cope better in faster-moving environments.
For once, though, Velodyne is starting to see competition. Russell’s startup, Luminar, has developed its own mechanical lidar system, which uses a single mirror to steer a powerful laser in order to see 250 meters, with beams separated by as little as 0.05° (see “College Dropout Says He’s Cracked Self-Driving Cars’ Most Crucial Component”). And other firms, such as Aeva, are experimenting with systems that blend lidar and radar approaches, to push range and resolution even further.
Still, Velodyne has more experience with actually building and selling such devices, and it says it will ship units of the VLS-128 before 2017 is out. Gopalan wouldn’t say whether its customers included firms that were building their own alternatives, such as Waymo and Uber, explaining only that he would “expect this sensor to meet the needs of any autonomous-vehicle program in the industry, either current or in the future.”
For now, big companies may be the only ones able to afford the sensor. Gopalan wouldn’t say how much it will cost initially, explaining only that it could come down to “thousands of dollars” once it enters mass production, though it’s not clear when that will happen. For comparison, the device’s predecessor, the HDL-64E, is not mass-produced and costs $80,000 per unit. Gopalan did say, however, that the new sensor will probably be installed in robotic taxis and autonomous trucks at first.
Russell shares that sentiment. “Some of the companies in the ride-sharing space have said that if you had an autonomous car that just worked, they’d be willing to purchase these cars for $300,000 to $400,000 apiece and buy as many as you could possibly make,” he says. That’s because the total vehicle cost—and, therefore, the price of sensors—is less important than it is in a consumer car, because a steep investment could be recouped quickly by keeping a vehicle on the road nearly 24 hours a day.
Meanwhile, many firms, including Velodyne, are busy building low-cost, solid-state lidar sensors for autonomous vehicles. But in truth, they lack the range and resolution required for high-speed driving (see “Low-Quality Lidar Will Keep Self-Driving Cars in the Slow Lane”). Gopalan agrees. He imagines seeing such devices used as driver aids in consumer vehicles, or to augment expensive sensors in fully autonomous vehicles.