
Business Impact

Why Intel’s Job Cuts May Be Just the Beginning

The once-dominant chipmaker is cutting 12,000 workers, and several emerging technological trends may cause even more difficulty.

Intel is cutting 12,000 workers as it faces the financial consequences of underestimating a profound shift in computing from desktop computers to pocket-sized devices.

And more trouble may lie ahead. The rate at which Intel makes technological advances suddenly seems to be slowing, and other looming trends, including artificial intelligence and perhaps virtual reality, look set to benefit a different kind of computer architecture.

The job cuts are a sign that Intel misjudged the speed with which people would abandon desktops in favor of smartphones and tablets, and failed to reposition its product line to ride that revolution. Only last week the research company Gartner reported that PC shipments were down 9.6 percent in the first quarter of the year.

Intel is perhaps also guilty of focusing too heavily on wringing ever more power out of computer chips, when power efficiency is just as important in mobile devices. Intel does have a line of mobile processors, but most mobile devices are based on a rival architecture licensed from a British company called ARM.

The company is now finding that the rate at which it can double the number of transistors on its chips, a cadence dubbed Moore’s Law after Intel cofounder Gordon Moore, is slowing down.

And while Intel says it will refocus its attention on cloud computing and devices for the Internet of Things, it risks missing out on several up-and-coming opportunities. Artificial intelligence and virtual reality are already feeding demand for a completely different kind of chip architecture.

Last week, I spent a few days at a developer conference in San Jose organized by Nvidia, a chip company that makes graphics processing units, or GPUs. These chips are especially good at the kind of parallel computations companies are harnessing for deep learning (a powerful kind of machine learning), and of course they are geared toward rendering the highly realistic 3-D environments needed for virtual reality. Indeed, the Nvidia event was filled with demos of self-driving cars, deep-learning systems, and virtual-reality headsets.

So beyond cutting jobs, Intel might need to think about how it can feed the industry’s appetite for AI and VR if it doesn’t want to miss the next big shift in how we use computers.

(Read more: New York Times, Gartner, “Intel Puts the Brakes on Moore’s Law”)
