Dedicated machine-learning hardware could help Google fight off rivals in an increasingly competitive cloud AI market.
Backstory: Last year, Google announced it had designed a new chip, called a tensor processing unit (TPU), built to crunch the math that AI relies on. At the time, Google kept the chips for its own use and granted access to only a select group of researchers.
What's new: The New York Times reports that Google will let other companies use the hardware via its cloud services. “We are trying to reach as many people as we can as quickly as we can,” Zak Stone, leader of Google’s TPU team, told the newspaper.
Why it matters: Putting AI in the cloud is big business. Google, Amazon, and Microsoft all provide AI software on their cloud servers, and China is joining the race, too. By offering dedicated hardware for AI grunt work, Google hopes to gain a competitive edge over its rivals.