Yesterday, a Volkswagen Passat drove around a parking lot in Mountain View, CA, made three-point turns, and followed the rules at a four-way stop–all without human intervention. The computer-controlled car is named Junior, and it’s Stanford University’s official entry in the DARPA Urban Challenge, a race in which an autonomous car must navigate city streets, obey traffic laws, avoid obstructions, and, crucially, drive well among other cars in traffic. This test run is Junior’s first public appearance, designed to let DARPA (the Defense Advanced Research Projects Agency) test the car and determine if it proceeds to the next round in the Urban Challenge.
The motivation for the Urban Challenge is to build a better car. “As we all know, cars are unsafe,” says Sebastian Thrun, the team leader and a professor of computer science at Stanford. Car accidents kill roughly 42,000 people a year in America and about a million people worldwide, he says. In addition, cars are inefficient, causing traffic jams and requiring people to consistently focus on the road during long commutes. The Stanford researchers’ goal is to design a car that can drive itself, conceivably making roads safer and giving people back their time. “The idea of a self-driving car, in my opinion, will change society,” says Thrun.
Two years ago, the Stanford team, using a computerized car named Stanley, won DARPA’s Grand Challenge, an autonomous-vehicle race in the desert. That car had a number of GPS sensors and lasers, a camera, and other equipment to help it make its way through the course. Junior is based on the same fundamental technology, says Thrun, but with some crucial improvements.
Junior uses the same kind of laser perception as Stanley, but with longer range. The new car has a total of eight LIDAR systems that emit beams of light and detect reflections to determine the distance of other objects. One system is mounted on the front of Junior’s roof and has a range of about 100 meters–many times that of Stanley. Another LIDAR system points at the ground and constantly keeps track of the road and reflective lane markers. A third system constantly takes a 360-degree image of its surroundings. All this data is processed by two Intel quad-core machines running at 2.3 gigahertz, and the pertinent information is relayed to the driving systems, which guide the car.
Junior is also equipped with a precise location system that includes GPS and other sensors that measure the revolution of the wheels and the direction the car is moving in. Together, these sensors allow Junior to pinpoint its location to within 30 centimeters.
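Combining an absolute but noisy GPS fix with smooth wheel-odometry dead reckoning is typically done with a Kalman-style filter. The sketch below is purely illustrative (a one-dimensional toy, not Stanford's actual localization code), and every numeric value in it is an assumption:

```python
# Minimal 1-D Kalman-style fusion of wheel odometry and GPS.
# Real localization is multi-dimensional; this shows only the idea.

def predict(est, var, odometry_delta, odometry_var):
    """Dead-reckoning step: advance the estimate by measured wheel travel."""
    return est + odometry_delta, var + odometry_var

def update(est, var, gps_pos, gps_var):
    """Correct the estimate with a (noisier) absolute GPS fix."""
    k = var / (var + gps_var)            # Kalman gain: how much to trust GPS
    return est + k * (gps_pos - est), (1 - k) * var

est, var = 0.0, 1.0                      # start: known position, 1 m^2 variance
est, var = predict(est, var, 5.0, 0.04)  # wheel encoders report 5 m traveled
est, var = update(est, var, 5.3, 2.25)   # GPS reads 5.3 m, sigma = 1.5 m
print(round(est, 2), round(var, 2))      # -> 5.09 0.71
```

The key property is that uncertainty (`var`) grows during dead reckoning and shrinks with each absolute fix, which is how a car can hold a tight position estimate between GPS updates.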
Importantly, says Thrun, Junior has a lot more “intelligence” than Stanley so that the computer can deal with intersections and traffic. Such tasks simply weren’t a part of the previous race, which basically involved driving down a curvy desert road. This intelligence comes in the form of about 500 different probabilistic algorithms that process all the environmental information collected by the sensors and make the decision that is most likely to be the best. Thrun says that these decisions are made in less than 300 milliseconds, which is sufficient for slowing down or changing lanes if a car in another lane tries to merge into Junior’s. “In the last race, you basically only had to decide whether to speed up or slow down,” says Thrun, “but this time there are discrete decisions on top of that.”
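One common way to frame such discrete driving decisions is expected-utility maximization: score each candidate maneuver by its probability-weighted outcomes and pick the best. The toy sketch below is an assumption about how such a chooser might look, not the team's algorithm, and all probabilities and utilities are invented for illustration:

```python
# Toy expected-utility chooser over discrete driving maneuvers.
# All numbers are made-up illustrations, not real driving data.

def best_action(actions):
    """actions: {name: [(probability, utility), ...]} -> name with the
    highest expected utility."""
    def expected(outcomes):
        return sum(p * u for p, u in outcomes)
    return max(actions, key=lambda a: expected(actions[a]))

candidates = {
    "keep_lane":   [(0.7, 1.0), (0.3, -5.0)],   # risk: merging car cuts in
    "slow_down":   [(1.0, 0.5)],                # safe, but loses some time
    "change_lane": [(0.9, 0.8), (0.1, -10.0)],  # usually fine, rare bad case
}
print(best_action(candidates))  # -> slow_down
```

With these invented numbers, yielding wins because the small chance of a collision outweighs the time saved, which mirrors the conservative behavior a merging car should trigger.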
In Thursday’s test, Junior successfully completed all the tasks DARPA assigned. The first was a safety test that ensured that the team could remotely stop the car from a speed of 20 miles per hour. The other tasks included navigating within a lane, stopping at stop signs, making U-turns, avoiding obstacles, and following driving instructions that DARPA provides.
Junior will move on to another test run in October and, if all goes well, eventually compete in the final round of the Urban Challenge on November 3.