A spin-out company from the University of Oxford called Oxbotica has developed a new software system for making regular cars into driverless vehicles.
The system, called Selenium, can ingest data from visual cameras, laser scanners, or radar systems. It then uses a series of algorithms to establish where the vehicle is, what surrounds it, and how it should move. “It takes any vehicle and makes it into an autonomous vehicle,” explains Paul Newman, a professor at the University of Oxford and cofounder of Oxbotica. That sounds ambitious, but he’s being serious: the team plans for the software to control not just autonomous cars but also warehouse robots, forklifts, and self-driving public transport vehicles.
Most systems being developed by other manufacturers are built to handle driving robustly from the moment they are first switched on. Tesla’s Autopilot, for example, uses onboard cameras and image analysis to control the car on highways, but its reliability has come into question after a series of recent crashes.
Oxbotica’s software gradually acquires data about the routes along which a vehicle is driven and learns how to react by analyzing the way its human driver acts. “When you buy your autonomous car and drive off the lot, it will know nothing,” says Ingmar Posner, an associate professor at Oxford and another of Oxbotica’s cofounders. “But at some point it will decide that it knows where it is, that its perception system has been trained by the way you’ve been driving, and it can then offer autonomy.”
The company explains that the software provides two primary functions: localizing the vehicle in space and perceiving what’s happening around it. Based on those two feeds, a central planner can determine how the car should move. Both the localization and perception systems rely on sensors dotted about the vehicle, the choice of which depends on the application: Newman suggests that a warehouse forklift may use just cheap cameras, while a car could make use of all kinds of sensors.
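The division of labor described above can be illustrated with a toy sketch. This is not Oxbotica’s code or API; the names (`Pose`, `Obstacle`, `plan`) and the trivial braking rule are invented for illustration, standing in for the two sensor-derived feeds and the central planner:

```python
import math
from dataclasses import dataclass


@dataclass
class Pose:
    """Output of the localization feed: where the vehicle is."""
    x: float
    y: float


@dataclass
class Obstacle:
    """Output of the perception feed: something near the vehicle."""
    x: float
    y: float
    label: str  # e.g. "pedestrian", "vehicle"


def plan(pose: Pose, obstacles: list[Obstacle], safe_distance: float = 10.0) -> str:
    """Toy central planner: combine the two feeds into a motion decision.

    Brake if any detected obstacle is within the safe distance of the
    vehicle's estimated position; otherwise keep cruising.
    """
    for ob in obstacles:
        if math.hypot(ob.x - pose.x, ob.y - pose.y) < safe_distance:
            return "brake"
    return "cruise"
```

In a real system the planner would produce a trajectory rather than a single verb, but the shape is the same: two independent feeds in, one motion decision out.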
Selenium can compare on-the-fly sensor readings with those stored in prior maps from previous journeys in similar conditions. “If you take it out in the snow and it’s not seen it before, it keeps the ideas of snowy-ness around for the next time,” Newman says. Then it can identify image features—such as details on buildings or the placement of street furniture—to localize the vehicle in the wider world. Meanwhile, laser data, thanks to its high resolution, can be used to localize the car more accurately, especially in low-visibility conditions when cameras can falter.
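The idea of keeping past conditions around and matching live readings against them can be shown with a minimal nearest-neighbor sketch. The feature vectors and condition names here are made up for illustration; the real system matches far richer image and laser features, not two-element lists:

```python
def best_experience(live: list[float], experiences: dict[str, list[float]]) -> str:
    """Return the name of the stored experience whose feature vector is
    closest (by squared Euclidean distance) to the live sensor reading."""
    def sq_dist(a: list[float], b: list[float]) -> float:
        return sum((x - y) ** 2 for x, y in zip(a, b))

    return min(experiences, key=lambda name: sq_dist(live, experiences[name]))


# Hypothetical stored experiences from previous journeys:
experiences = {
    "sunny": [0.9, 0.1],
    "snowy": [0.2, 0.8],
}
# A live reading taken in falling snow matches the snowy experience:
best_experience([0.3, 0.7], experiences)
```

Once the closest prior experience is found, its map can be used as the reference frame for localization, which is why an unfamiliar snowy day is worth remembering for next time.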
The team initially teaches Selenium to recognize, say, cars and humans by providing it with a labeled training set from which it can learn. But over time it also learns from the driver. “If a human’s driving and they pass straight through what the car thought was a human, the software can learn from that,” Posner says. The system uses similar prior knowledge and continual learning to work out, for instance, which parts of a surface it can safely drive on or how traffic signals are changing.
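Posner’s false-positive example can be sketched as a tiny label-correction step. The function and label names are hypothetical, not Oxbotica’s; the point is only that the driver’s behavior supplies a correction signal for the next round of training:

```python
def corrected_labels(predictions: dict[str, str], driven_through: set[str]) -> dict[str, str]:
    """Toy version of learning from the driver.

    Any detection the human drove straight through cannot have been a real
    obstacle, so relabel it as free space before the next training round.
    """
    return {
        det: ("free_space" if det in driven_through else label)
        for det, label in predictions.items()
    }


# The perception system thought detection "a" was a human, but the
# driver passed straight through it, so the label is corrected:
corrected_labels({"a": "human", "b": "car"}, driven_through={"a"})
```

The same correction loop applies to the other continual-learning examples in the article, such as refining which surfaces are safe to drive on.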
The result is a vehicle that can gain a deep understanding of the routes it drives regularly. That, Posner says, means that the software isn’t simply trying to do a mediocre job wherever it’s placed—instead, it does an excellent job where it’s learned to drive. I took a spin in a modified Renault Twizy that had been fitted with lasers, cameras, and a large computer powered by Selenium. It felt much like being driven by a confident human driver, with smooth but assertive acceleration, braking, and steering—though there were no hazards on the simple loop we drove.
Oxbotica’s software is set to be tested in two real-world settings in the near future: the self-driving public transport vehicles of the GATEway project in Greenwich, London, and the LUTZ Pathfinder driverless pods being tested in Milton Keynes, U.K. When pressed, Newman explained that the company is already working with auto manufacturers, though he wasn’t able to say which ones, or when the technology might be rolled out in cars.
Despite recent investigations into Tesla’s autonomous systems casting something of a shadow over self-driving car technology, those working in the sector are clearly pressing ahead with their work. Oxbotica isn’t alone in launching software: Nissan also announced its new ProPilot driver aid this week. The two systems are very different, but their arrivals suggest that the race to achieve automotive autonomy shows no signs of slowing.