Jaguar Demos a Car That Keeps an Eye on Its Driver
A company called Seeing Machines wants to use cameras and software to make sure you’re focused on driving.
Distracted drivers kill nearly 10 people each day in the United States and injure 1,150 more, according to the Centers for Disease Control and Prevention.
Many cars already include plenty of sensors—cameras for spotting objects in your blind spot, for instance—but they’re usually keeping an eye on the outside world, not on what’s going on behind the wheel.
An Australian company called Seeing Machines is turning sensing inward with technology that focuses on drivers themselves in hopes of reducing distracted and drowsy driving. The company is using cameras and software to detect eye and facial movements so it can alert drivers who have become inattentive, either due to drowsiness or distraction. This kind of technology is set to become more common, especially as cars become more capable of driving themselves on some stretches of road.
So far, the company has focused its technology on drivers of heavy industrial trucks used in the mining industry, but it’s also moving toward the consumer market: it has a deal with auto parts maker Takata to bring its technology to cars and other vehicles. At the International Consumer Electronics Show in Las Vegas this week, Seeing Machines will demonstrate its attention-monitoring sensors in the dashboard of a Jaguar F-Type.
Built-in systems for tracking drivers’ attention are an option on a small but growing number of cars from companies such as Lexus. Some companies, like Google, envision a future in which cars are completely automated.
Seeing Machines’ technology uses a small infrared camera fitted to the dashboard that works with software running on the vehicle—not on a remote server—to evaluate whether the driver is looking at the road. It evaluates a person’s head position, facial expression, and blinking rate. The camera captures 60 frames per second, and the software analyzes the images to determine the driver’s alertness.
Nick Langdale-Smith, Seeing Machines’ vice president for company partnerships, says the company can get a good read on a driver’s state by tracking the pupils in particular. The software measures eyeball rotation and detects where the driver’s line of sight intersects with surrounding objects. This lets it determine how much time a driver spends looking at the dashboard, mirrors, road, and elsewhere—which helps it judge whether the driver is paying attention to traffic or starting to doze off.
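The article doesn't disclose how Seeing Machines scores attention, but the idea it describes—tracking which region the gaze intersects frame by frame and flagging too much time off the road—can be sketched roughly. The region labels, window length, and alert thresholds below are invented for illustration:

```python
from collections import deque

FPS = 60                # camera frame rate cited in the article
WINDOW_SECONDS = 5      # illustrative look-back window (assumption)
OFF_ROAD_LIMIT = 2.0    # seconds of continuous off-road gaze before alerting (assumption)

class AttentionMonitor:
    """Toy sliding-window check of where the driver is looking.

    Each frame carries a label for the region the gaze ray intersects
    ("road", "mirror", "dashboard", ...). The labels and alert rules
    here are hypothetical, not Seeing Machines' actual algorithm.
    """

    def __init__(self):
        self.frames = deque(maxlen=FPS * WINDOW_SECONDS)
        self.off_road_frames = 0

    def update(self, gaze_region: str) -> bool:
        """Record one frame; return True if the driver should be alerted."""
        self.frames.append(gaze_region)
        if gaze_region in ("road", "mirror"):
            self.off_road_frames = 0
        else:
            self.off_road_frames += 1
        # Alert on a long continuous glance away from the road...
        if self.off_road_frames / FPS >= OFF_ROAD_LIMIT:
            return True
        # ...or on a low share of road-directed gaze over a full window.
        road_share = self.frames.count("road") / len(self.frames)
        return len(self.frames) == self.frames.maxlen and road_share < 0.5

monitor = AttentionMonitor()
# Simulate 3 seconds of looking at a phone (180 frames at 60 fps).
alerts = [monitor.update("phone") for _ in range(180)]
print(alerts.index(True))  # first alert at frame index 119, i.e. after 2 s
```

A real system would derive the gaze region from head pose and eyeball rotation rather than receive it as a label, and would fold in blink rate and facial expression as the article notes.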
If the system senses you’re not paying attention, it will alert you to put your eyes back on the road or pull over. For companies that are already using the system in their trucks, this comes via a seat-based buzz, though Langdale-Smith says this won’t be the case in an eventual consumer version of the technology. “This is there to save your life,” he says.
To date, Seeing Machines has honed its technology in the mining industry, where Caterpillar and other makers of huge vehicles that transport earth and minerals are using it to monitor drivers. “The shift is long and the task is boring,” says Langdale-Smith. “And when [drivers] fall asleep, the vehicles turn into 450-ton juggernauts.”
Seeing Machines’ deal with Takata, announced in September, will include the installation of driver-monitoring systems in cars made by an unnamed major automaker—it’s not clear when this will happen, though.