
Won’t you be my neighborhood autonomous vehicle?

Optimus Ride gets cars driving safely on their own by limiting where they operate.
February 27, 2019
Photographs by Doug Levy

If you want to see the future of transportation, hop a train out of Boston to the suburb of South Weymouth. There, in a parking lot adjacent to the station, a fleet of seven cars has been picking people up and driving them to a nearby housing development—all without a human placing a hand on the steering wheel.

On a recent night, one of the vehicles carried a twentysomething woman to her condominium about a mile away. After the woman hopped in the car, which looked like a golf cart with an enclosed body and doors, she used a touch-screen tablet to select a route from a menu. A software operator sitting in the front passenger seat authorized the trip with a tap on his laptop, and a test driver in the driver’s seat pressed a button on the vehicle’s dashboard to put the car into self-driving mode. Though the driver was there for safety, in case something unexpected happened and someone needed to take the wheel, the vehicle planned the route and drove itself.

The technology comes from Optimus Ride, a Boston-based startup with deep ties to MIT that aims to be among the first companies to transport mass quantities of people in driverless vehicles. “The vehicle should arrive empty, pick you up, and drive you someplace else,” says CEO Ryan Chin, SM ’00, SM ’04, PhD ’12. “There should be no operators on board; that’s our goal.”

No company anywhere in the world has yet managed to provide a completely driverless service to a large number of people. That includes Waymo, a subsidiary of Alphabet that is developing autonomous vehicles, and the self-driving divisions of Uber and Lyft. There are plenty of other companies in the race, too, from major automakers to the electric-car maker Tesla to startups.

Optimus Ride aims to zip in front of these competitors by targeting a smaller market and a more specific type of driving. Instead of trying to transport people anywhere they might want to go, the startup limits its vehicles to areas with virtual boundaries, defined by GPS. These could include planned communities—as in South Weymouth—as well as university or corporate campuses, resorts, waterfronts, and areas hosting festivals or concerts.
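In software terms, a geofence like this reduces to checking whether the vehicle’s GPS coordinates fall inside a polygon outlining the service area. Here’s a minimal sketch of that idea, with hypothetical coordinates and function names rather than Optimus Ride’s actual code:

```python
def inside_geofence(lat, lon, boundary):
    """Ray-casting point-in-polygon test.

    boundary: list of (lat, lon) vertices outlining the service
    area. Returns True if the position falls inside it.
    """
    inside = False
    n = len(boundary)
    for i in range(n):
        y1, x1 = boundary[i]
        y2, x2 = boundary[(i + 1) % n]
        # Does this edge cross the horizontal ray at our latitude?
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Toy rectangular "campus" near South Weymouth (made-up corners).
campus = [(42.145, -70.945), (42.155, -70.945),
          (42.155, -70.935), (42.145, -70.935)]
print(inside_geofence(42.150, -70.940, campus))  # True: inside the box
print(inside_geofence(42.200, -70.940, campus))  # False: outside it
```

A production system would use a geodesy-aware library and a far more detailed boundary, but the gating logic is this simple: if the check fails, the vehicle doesn’t plan a route there.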

The approach has allowed Optimus Ride to become one of only a few self-driving-vehicle companies currently generating revenue. Those rides it provides in South Weymouth? A real estate developer pays the startup to offer them as an amenity to residents of Union Point, a smart-city housing development at the 1,500-acre site of a former naval air station. Chin says he’s negotiating similar arrangements with more than a dozen other customers, including some in Asia and the Middle East. If all goes as planned, the startup could bring self-driving vehicles to a mass audience around the globe within the next two years—and perhaps even turn a profit.

Optimus Ride cofounders Albert Huang, PhD ’10; Sertac Karaman, SM ’09, PhD ’12; Ramiro Almeida; Ryan Chin, SM ’00, SM ’04, PhD ’12; and Jenny Larios Berlin, SM ’15, MBA ’15, on the test track at the company’s headquarters.

MIT roots

The road to launching a self-driving startup began at MIT. All five of Optimus Ride’s cofounders studied or worked at the Institute before establishing the company in 2015. At the time, Chin was the managing director of the City Science Initiative at the Media Lab. Ramiro Almeida, who is now Optimus Ride’s chief strategy officer, was a visiting scholar in the same project. Sertac Karaman, SM ’09, PhD ’12, was an assistant professor of aeronautics and astronautics; today he’s an associate professor in addition to serving as Optimus Ride’s president and chief scientist. Chief marketing and operating officer Jenny Larios Berlin, SM ’15, MBA ’15, had just earned a master’s in city planning and an MBA from Sloan. And Albert Huang, PhD ’10, now Optimus Ride’s chief technology officer, was working for Google’s research and development organization after earning a doctorate in electrical engineering and computer science.

All five had experience either designing autonomous vehicles or deploying transportation systems, so they were aware of advances being made in self-driving technology. Karaman and Huang had helped build the driverless car that MIT entered in the 2007 Defense Advanced Research Projects Agency (DARPA) Urban Challenge, the first serious contest involving autonomous vehicles in an urban environment. (Their team was one of just six to finish the race.) Chin had designed a shared-use electric vehicle with robotic features as part of the Media Lab’s CityCar project. Larios Berlin worked at the car-sharing company Zipcar, where she introduced the service to several university campuses. Almeida had helped lead the development of the first subway line in Quito, Ecuador, as an advisor to the city.

Almeida, who had also founded the Spanish version of Wired and several startups, helped bring everyone together. “The technology was mature enough and the market was just getting started,” he says. “I thought there was a very important opportunity for us to actually create a company.” It didn’t take them long to come up with a name—an homage to the character Optimus Prime from the Transformers franchise. “We were thinking about sentient technologies, and that evolved into talking about Transformers, because that was the first story created around sentient technologies that became a global success,” says Almeida. Chin says the group’s “general fascination with heroic robots” also played a part.

Raising funding was more challenging. Instead of aiming for full automation, or what the automotive industry calls “level 5” autonomy—which would allow a driverless car to operate wherever a human driver could—the cofounders wanted to create level 4 cars, which can drive only in areas “geofenced” by GPS. They thought that approach would let them develop and deploy autonomous technology more safely, quickly, and cost-effectively than competitors because the driving would be less complicated.

Venture capitalists didn’t always agree. “When we started, other companies were talking about developing [self-driving] technology that had no restrictions, that would be capable of doing anything people asked them to do,” says Almeida. “A lot of investors looked at us and said, ‘Guys, you’re not going to make it.’”

The cofounders persevered and have so far raised $23.25 million in two rounds of financing. The company’s current investors view its narrow focus as a strength. “Private campuses have less traffic, pedestrians, and obstacles,” says Sanjay Aggarwal ’95, MBA ’03, a partner at the venture capital firm F-Prime, which invested in Optimus Ride in 2017. “It’s inherently easier to deploy this type of technology there.”

Chin says the MIT Center for Real Estate also helped with funding by introducing him to developers who turned into clients. Optimus Ride now employs nearly 100 people—roughly a third of them have PhDs, and almost half are MIT alumni—and aims to hire at least 100 more engineers in 2019. “We already know how to drive autonomously on our sites,” says Chin. “Now we’re learning how to scale, to get from a few sites to a few hundred, and to tens of thousands of vehicles.”

Optimus Ride outfits its vehicles with laser and camera sensors (shown here) as well as accelerometers, gyroscopes, odometers, and GPS devices.

Sensor fusion

Before Optimus Ride puts a vehicle on the road, it retrofits it with sensors and loads it with software. The startup develops all of the software the vehicle needs, but virtually none of the hardware. It currently buys its vehicles, which are electric and seat either four or six people at a time, from a company called Polaris Industries. (Known as neighborhood electric vehicles, in Massachusetts they’re officially classified as low-speed vehicles and can only go up to 25 miles per hour.) At Optimus Ride’s headquarters—a converted warehouse space on Boston’s industrial waterfront—technicians fit the cars out with computer-vision cameras, GPS devices, accelerometers, gyroscopes, lidars, and odometers that measure the rotation of the rear wheels. Data from those sensors flows to a GPU, which calculates what the car should do and sends instructions to a control board that handles steering, acceleration, and braking. (The company is also starting to work with another vehicle manufacturer to integrate this hardware directly into production lines.)

Machine-learning analysis of the data collected by the sensors, lasers, and cameras produces a map of the car’s real-time progress along its route, along with any obstacles in its path.

After software is installed, Optimus Ride tests the car on a course set up in a 10,000-square-foot driving area inside its facility. A person sitting in the passenger seat directs the vehicle via a laptop. To check that the various systems, such as the brakes, are working correctly, systems and testing engineers typically drive the vehicle within lines marked on the concrete floor with colored tape and stop at a traffic light—on loan from the City of Boston—and a pedestrian crossing installed on the course. Once the vehicle masters those maneuvers, it moves outside to the streets of the Seaport neighborhood so operators can see how it reacts in more complex situations.

When Optimus Ride first deploys vehicles at a particular site, engineers and test drivers drive a few of them around manually for several days to capture information about the surroundings. The resulting data is used to create maps that contain contextual information about lane markers, street signs, and landmarks.

Vehicles later deployed at that site can use that “base map” to navigate. They also help keep it current. “We’re always capturing data, so every vehicle that drives on that same road gives us more information we can use to improve the master map,” says Chin.
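One simple way to picture that continuous refinement: each new pass over a landmark folds another observation into the stored position as an incremental mean. The sketch below is a toy illustration with invented names and data, not the company’s actual mapping pipeline:

```python
def update_landmark(map_pos, observed_pos, count):
    """Fold one more observation of a landmark into the stored
    base-map position using an incremental mean.

    map_pos: (x, y) currently stored in the base map
    observed_pos: (x, y) measured on this pass
    count: how many observations map_pos already averages
    """
    count += 1
    new_pos = tuple(m + (o - m) / count
                    for m, o in zip(map_pos, observed_pos))
    return new_pos, count

# Hypothetical stop-sign position, refined by three later passes.
pos, n = (10.0, 5.0), 1  # first survey pass
for obs in [(10.2, 5.1), (9.9, 4.9), (10.1, 5.0)]:
    pos, n = update_landmark(pos, obs, n)
print(pos, n)  # (10.05, 5.0), 4 — the mean of all four passes
```

Real mapping stacks weight observations by sensor confidence and handle moved or removed landmarks, but the core idea is the same: every trip is also a survey.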

Focusing on small, geofenced areas also enables Optimus Ride to learn what Karaman, the chief scientist, calls the “culture” of driving in a specific place. To operate at the South Weymouth station, the startup had to teach its software and systems how to navigate crowds of commuters jostling to get home. In Union Point, it had to train them to recognize and then ignore tall, thin rods that the developer stuck on the sides of the road to guide snowplows. In the Seaport, the vehicles must navigate wide roads that aren’t well marked, maneuvering around the T’s Silver Line buses, dense traffic around the Design Center, commercial trucks, and seagulls. And while drivers at Union Point tend to be well behaved, “the Seaport is a great testing place because it’s just insane,” says John Sgueglia, SM ’15, who oversees vehicle systems and testing for Optimus Ride.

Going on an Optimus ride
When Optimus Ride tests its vehicles on public roads, it never wants to cause an accident or be at fault for anything. So it programs the cars to be extra cautious.
That cautiousness permeated most of the ride I took in one of the startup’s vehicles in Boston’s Raymond L. Flynn Marine Park. The industrial site is full of pedestrians, city buses, commercial trucks, and the occasional rabbit or seagull, so most of the time I appreciated the risk-averse behavior. But when we stopped at an intersection far longer than a human driver would have waited, I got a bit restless.
The issue: the oncoming traffic had the right of way and the vehicle knew it, because Optimus Ride encodes rules of the road into its software. So we waited until the road was clear before moving—even though we were driving straight ahead and the other cars would almost certainly have let us go, impatient though they might have been to turn in front of us.
Soon after we made it through the intersection, we encountered a car that was stopped with its blinkers on. Optimus Ride programs its vehicles not to cross a double yellow line, ever, so we had to wait until the car moved to the side of the road before we could proceed.
At one point I thought the vehicle should have exercised more restraint, though. Back at the intersection, a man walked in front of us just as we were finally about to move forward. The vehicle advanced about a foot and then stopped itself. The pedestrian still had plenty of room, but he gave us a dirty look.
Overall, however, the vehicle drove the way you’d want it to: keeping a safe distance from other cars, stopping at stop signs, and decelerating automatically when it crossed railroad tracks or descended a ramp. “Humans take way more risk driving than we think we do,” says system engineer and test driver John Sgueglia, SM ’15. “We could encode that way of thinking into our vehicles, but would you, as a passenger, really want us to?”

“‘Culture’ refers to the way people drive, but also the way the environment is set up,” says Karaman. “Understanding how people actually behave in these environments is critical to being able to drive fully autonomously 100% of the time someday.”

Optimus Ride uses machine learning to combine and analyze data from different types of sensors, an approach known as sensor fusion. That ability to quickly and effectively integrate data from multiple sensors is critical: the data is fed into Optimus Ride’s software, which produces a map that shows both the car’s real-time progress along its planned route and any obstacles along that path. Other vehicles show up in green, and pedestrians show up in orange, but Optimus Ride doesn’t give more weight to any particular type of obstacle; the goal, after all, is to avoid hitting any of them.
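At its simplest, fusing two position sources means letting one correct the other. The toy filter below blends smooth but drift-prone dead reckoning with noisy but absolute GPS fixes; it’s a hypothetical stand-in for the Kalman-style estimators such systems typically use, with all names and numbers invented for illustration:

```python
def fuse_position(odom_estimate, gps_fix, gain=0.5):
    """Nudge the dead-reckoned estimate toward the GPS fix.

    Odometry is smooth but accumulates drift; GPS is absolute but
    jumpy. Blending them keeps short-term smoothness while bounding
    long-term drift (a toy complementary filter, not a Kalman filter).
    """
    return odom_estimate + gain * (gps_fix - odom_estimate)

# 1-D example: odometry has drifted 3 m ahead of the true position
# (10.0 m); repeated GPS fixes scattered around the truth pull it back.
estimate = 13.0
for gps_fix in [9.6, 10.3, 9.9, 10.2, 10.0]:
    estimate = fuse_position(estimate, gps_fix)
print(estimate)  # back within about half a meter of the truth
```

Production sensor fusion runs in several dimensions at once, weights each sensor by its estimated noise, and feeds the fused state to the planner, but the correction loop works on this same principle.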

Equitable mobility

Some researchers think autonomous vehicles will exacerbate urban congestion by making driving so easy and convenient that people will favor cars over public transit. Chin says he doesn’t think Optimus Ride will do that, because it operates shared electric vehicles and is designed to transport people from home or work to the nearest bus, subway, and train stops. “We want to complement mass transit,” he says. “If we feed into those systems appropriately, that will increase ridership, which will allow them to invest in better signaling, more trains, larger stations, and so forth.”

Chin says MIT’s “general ethos” of regarding technology as a force for good helped shape these goals. “There’s this idea that if we develop really great self-driving technology, it will benefit people who don’t have good access to transportation, whether that’s because they’re economically challenged, discriminated against, or blind or deaf,” he says. “We’ve taken the stance that it’s important to do those things.”

To ensure that people with disabilities can use its vehicles, Optimus Ride conducted user research at the Perkins School for the Blind and made its user interface easily customizable. The startup also plans to widen access to more people soon. Its shuttles in South Weymouth transport only residents of the Union Point development, who book rides using a special mobile app. But other planned communities intend to offer the ride service to visitors, such as prospective tenants, for free. “You would just come to a community and vehicles would be lined up,” says Chin. “You’d open the door and get in, and it would tell you a bunch of places you could go.” Customers also expect to use the vehicles for other services, such as food and package delivery and trash pickup.

This vision of affordable, equitable, intelligent vehicles on demand might seem fanciful, but not to Kris Carter, who helps lead the Mayor’s Office of New Urban Mechanics in Boston. His agency oversees the city’s testing of autonomous vehicles and began working with Optimus Ride in 2017. “We think driverless cars could make our streets safer and give residents who don’t have great transit access some better options,” he says. “We’re excited about the technology they’re developing.”

What gives Carter confidence in Optimus Ride? Its MIT roots, in part. “They have a really nice blend of people who understand urban environments, as well as technology and how to run a business,” he says. “They can speak the same language as cities.” 
