
Why Intel’s Job Cuts May Be Just the Beginning

The once-dominant chipmaker is cutting 12,000 workers, and several emerging technological trends may cause even more difficulty.
April 20, 2016

Intel is cutting 12,000 workers as it faces the financial consequences of underestimating a profound shift in computing from desktop computers to pocket-sized devices.

And more trouble may lie ahead. The rate at which Intel makes technological advances suddenly seems to be slowing, and other looming trends, including artificial intelligence and perhaps virtual reality, look set to benefit a different kind of computer architecture.

The job cuts are a sign that Intel misjudged the speed with which people would abandon desktops in favor of smartphones and tablets, and failed to reposition its product line to ride that revolution. Only last week the research company Gartner reported that PC shipments were down 9.6 percent in the first quarter of the year.

Intel is perhaps also guilty of focusing too heavily on wringing ever more power out of computer chips, when power efficiency is just as important in mobile devices. Intel does have a line of mobile processors, but most mobile devices are based on a rival architecture licensed from a British company called ARM.

The company is now finding that the rate at which it can double the number of transistors on its chips, the cadence dubbed Moore’s Law after Intel co-founder Gordon Moore, is slowing down.

And while Intel says it will refocus its attention on cloud computing and devices for the Internet of things, it risks missing out on several up-and-coming opportunities. Artificial intelligence and virtual reality are already feeding demand for a completely different type of chip architecture.

Last week, I spent a few days at a developer conference in San Jose organized by Nvidia, a chip company that makes graphics processing units, or GPUs. This type of chip is especially good at the kind of parallel computation companies are harnessing to perform deep learning (a powerful kind of machine learning), and GPUs are, of course, also geared toward rendering the highly realistic 3-D environments needed for virtual reality. Indeed, the Nvidia event was filled with demos of self-driving cars, deep-learning systems, and virtual-reality headsets.
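For a sense of why that parallelism matters, here is a minimal sketch, assuming the PyTorch library and a CUDA-capable GPU (illustrative choices, not tools mentioned in the article), that runs the same large matrix multiplication, the core operation in deep learning, on the CPU and then on a GPU.

```python
# Minimal sketch: the same matrix multiplication on CPU and GPU.
# Assumes PyTorch is installed and a CUDA-capable GPU is present;
# both are illustrative assumptions, not details from the article.
import time
import torch

n = 4096
a = torch.randn(n, n)
b = torch.randn(n, n)

# CPU: the multiply runs on a handful of general-purpose cores.
start = time.time()
c_cpu = a @ b
print(f"CPU matmul: {time.time() - start:.3f} s")

if torch.cuda.is_available():
    # GPU: the same multiply is spread across thousands of simpler cores,
    # which is the parallelism deep-learning workloads exploit.
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()
    start = time.time()
    c_gpu = a_gpu @ b_gpu
    torch.cuda.synchronize()  # wait for the asynchronous GPU kernel to finish
    print(f"GPU matmul: {time.time() - start:.3f} s")
```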

So beyond cutting jobs, Intel might need to think about how it can feed the industry’s appetite for AI and VR if it doesn’t want to miss the next big shift in how we use computers. 

(Read more: New York Times, Gartner, “Intel Puts the Brakes on Moore’s Law”)
