
Intel and AMD Team Up to Take On Nvidia’s AI Chip Dominance

November 6, 2017

Nvidia is the biggest name in graphics processing units, the fast, powerful computer chips originally created for video games that are increasingly behind many artificial-intelligence projects. Other chipmakers are scrambling to catch up, going so far as to partner with companies that had been their fierce rivals.

The Wall Street Journal (paywall) reported Monday that Intel and AMD are working together on a new laptop chip meant to challenge Nvidia’s dominance. The chip will use an Intel processor and an AMD graphics unit and is designed to be “thin and lightweight but powerful enough to run high-end video games.” And if it’s good enough for video games, it’s likely to be good enough for many AI applications.

This is not Intel’s first effort to take on Nvidia. Intel bought Nervana, Movidius, and Mobileye in the last two years to expand its artificial-intelligence expertise. In September, the company revealed a neuromorphic chip, a new type of chip technology that is based on the human brain. It’s largely untested, but Intel is hoping to validate the neuromorphic approach by working with universities and research institutions on a product that could launch sometime next year. In October, it unveiled the Nervana Neural Network Processor family, a set of chips meant for data centers. Nvidia is still a small provider for data centers compared with Intel—but it has been creeping up, with sales tripling in 2017 to $1 billion.

Companies that aren’t traditionally chip manufacturers are also getting into the game. Google has Tensor Processing Units (TPUs), IBM is working on a chip, and Apple rolled one out recently, too. Much as Intel made a fortune becoming the dominant chipmaker for computers in the 20th century, the prize of being the chief hardware provider for the artificial-intelligence revolution is the kind of thing corporate legacies are made of. Oh, and there’s a ton of money to be made, too.

