
Uber’s self-driving truck plan relies heavily on humans

March 6, 2018

In a video released today, the ride-hailing giant laid out plans for how its self-driving trucks might fit into—and shake up—the trucking world.

Logistics, logistics, logistics: Uber’s idea is to coordinate exchanges between short-hauling, human-piloted trucks and long-haul self-driving vehicles at transfer stations around the US. Humans will handle the tighter roads close to cities and leave the interstates to AI-powered big rigs (see “10 Breakthrough Technologies of 2017: Self-Driving Trucks”).

For example: Uber featured two (human) truck drivers in the video. Mark, on his way from Los Angeles in a typical 18-wheeler, meets up in Arizona with Larry, the pilot of a self-driving truck coming from the Midwest. They exchange their trailers, an action Uber says will “require the hands-on work only truckers can do” (a nod to concerns that self-driving trucks will eliminate jobs). Mark then heads back to California, and Larry heads east on a long haul.

Takin’ it to the streets: According to the New York Times, Uber’s self-driving trucks have already been hauling commercial cargo on Arizona highways for the past few months, a hint that wider rollout of this plan might not be far off.
