
Drivers Wanted

Self-driving trucks are an experiment, and we’re the guinea pigs.
February 22, 2017

Imagine driving down the interstate past an 80,000-pound tractor-trailer. Its driver’s hands aren’t touching the wheel.

Tech companies envision—and are investing in—a future in which thousands of such vehicles would navigate our roadways (see “10 Breakthrough Technologies: Self-Driving Trucks”). Most people don’t welcome this scenario, nor should they. A 2016 study conducted by researchers at the University of Michigan Transportation Research Institute found that 95 percent of U.S. motorists had concerns about sharing the road with autonomous trucks and trailers. Safety was the major worry.

Skilled, experienced drivers play a huge role in ensuring the safe operation of heavy vehicles. The value of a human in that truck won’t go away no matter what technology is developed.


Those who advocate for self-driving cars often point out that human error is responsible for most traffic deaths. But that doesn’t mean self-driving cars and trucks will be able to avoid those errors. An automated vehicle in Pittsburgh recently drove the wrong way up a one-way street. Last year in Florida, a man using Tesla’s Autopilot feature was killed when the system failed to recognize a tractor-trailer in front of the car. These are not doomsday scenarios; they are legitimate concerns.

There are other worries: with cybersecurity breaches now a frequent topic in the news, what happens when not just one but a “platoon” of trucks is hacked? The risks to the public only increase as more vehicle systems are controlled by computer. Don’t forget that some of those trucks carry thousands of pounds of hazardous materials every day.

We don’t yet have any federal regulations governing automated vehicles. The government has issued guidelines for testing them, but they are voluntary guidelines for manufacturers, not regulations. A number of states allow testing of automated vehicles, but each applies different standards.

Self-driving cars and trucks are an experiment. But our highways shouldn’t be experimental grounds where public safety is put at risk. Yes, we should strive to innovate and make progress, but we also need to ensure that changes are indeed advancements for the betterment of our society—including the driving public and our nation’s workers.

Anything man-made can fail. If that failure occurred in a heavy vehicle driving next to you, wouldn’t you want a driver behind the wheel?

James P. Hoffa is the general president of the International Brotherhood of Teamsters.
