Intel buys into an AI chip that can transfer data 1,000 times faster
Intel and others are investing $13 million in Untether AI, a startup that's working on a novel type of chip for artificial intelligence that promises to perform neural-network calculations at warp speed.
Speedup: Untether, based in Toronto, Canada, has already developed a prototype device that transfers data between different parts of the chip 1,000 times more quickly than a conventional AI chip. That’s an impressive achievement, but it should be treated cautiously since the prototype is far larger than an actual chip—and because other factors will contribute to the overall performance of the finished device.
Bottleneck: One of the key challenges with modern chips is shuttling data from memory to the units that perform logical operations. This becomes especially problematic as the amount of data chips need to process grows, as it does with AI applications such as face or voice recognition. Untether uses what’s known as “near-memory computing” to shrink the physical distance between memory and the processing units, which speeds up data transfer and lowers power consumption.
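To make that bottleneck concrete, here is a minimal back-of-envelope sketch in Python. The per-byte and per-operation costs are illustrative assumptions, not figures from Untether or Intel; the point is only that when memory sits far from the logic, fetching a layer's weights can dwarf the arithmetic itself.

# Illustrative cost model (assumed numbers, not measured figures) comparing
# the time to run one neural-network layer when weights come from off-chip
# memory versus memory placed right next to the compute units.
DRAM_NS_PER_BYTE = 1.0        # assumed cost to move one byte from off-chip memory
NEAR_MEM_NS_PER_BYTE = 0.001  # assumed cost when memory sits beside the logic
MAC_NS = 0.01                 # assumed cost of one multiply-accumulate operation

def layer_time_ns(weights_bytes, macs, ns_per_byte):
    """Rough time for one layer: data movement for the weights plus the arithmetic."""
    return weights_bytes * ns_per_byte + macs * MAC_NS

# A fully connected layer with 1,000 x 1,000 weights (1M parameters, 1 byte each).
weights_bytes = 1_000_000
macs = 1_000_000

far = layer_time_ns(weights_bytes, macs, DRAM_NS_PER_BYTE)
near = layer_time_ns(weights_bytes, macs, NEAR_MEM_NS_PER_BYTE)
print(f"off-chip memory: {far:,.0f} ns   near-memory: {near:,.0f} ns")

Under these made-up numbers the off-chip case is dominated almost entirely by data movement, which is the imbalance near-memory designs aim to remove.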
On edge: Untether is developing what’s known as an “inference chip.” This is different from the sort of chip used to train a large neural network in a data center, which is a far bigger design and manufacturing challenge. An inference chip instead runs an already-trained network on a device such as a smartphone or a camera. The deep-learning boom has generated a huge amount of commercial activity around such inference chips.
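The training/inference split is easy to see in code. Below is a minimal sketch, assuming NumPy and a toy one-layer network (nothing here reflects Untether's hardware): training repeatedly updates the weights, while inference is just the forward pass with those weights frozen.

import numpy as np

# Tiny one-layer network: 4 inputs -> 2 outputs, squared-error loss.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 2))               # weights, to be learned
x = np.array([[0.5, -1.0, 0.25, 2.0]])    # one input example
y_true = np.array([[1.0, 0.0]])           # its target output

def forward(x, W):
    """The forward pass: the only computation an inference chip performs."""
    return x @ W

# Training (the data-center job): repeatedly adjust the weights.
for _ in range(200):
    y = forward(x, W)
    grad = x.T @ (y - y_true)             # gradient of 0.5 * squared error w.r.t. W
    W -= 0.05 * grad                      # gradient-descent step

# Inference (the on-device job): a single forward pass with the frozen weights.
print(forward(x, W))                      # converges to roughly [[1.0, 0.0]]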
No Moore: Such new approaches are gaining momentum as Moore’s Law (the observation that chips can be improved at a steady rate simply by shrinking the transistors packed onto them) shows signs of reaching its limit. Nor is Untether the only company working to bring memory and compute closer together for neural networks. One competitor, Mythic, is based in Redwood City, California.
FOMO: Untether’s design might be experimental, but it is easy to see why Intel would invest. Intel has seen its dominance erode in recent years with the rise of mobile devices built on alternative chip designs, and it is now desperate not to miss out on the AI boom as well. In 2016 it acquired Nervana, a startup developing chips for deep learning, and it is currently readying the first products based on those designs.