Dedicated machine-learning hardware could help Google fight off rivals in an increasingly competitive cloud AI market.
Backstory: Last year, Google announced it had designed a custom chip, called a tensor processing unit (TPU), built to accelerate the math behind machine learning. At the time, Google used the chips only internally and gave access to just a select group of researchers.
What's new: The New York Times reports that Google will now let other companies use the hardware via its cloud service. “We are trying to reach as many people as we can as quickly as we can,” Zak Stone, leader of Google’s TPU team, told the newspaper.
Why it matters: Putting AI in the cloud is big business. Google, Amazon, and Microsoft all provide AI software on their cloud servers, and China is joining the race, too. By offering dedicated hardware for AI grunt work, Google hopes to gain a competitive edge over its rivals.