This week, Intel will show off a chip that learns to recognize objects in pictures captured by a webcam. Nothing fancy about that, except that the chip uses about a thousandth as much power as a conventional processor.
The device, called Loihi, which Intel is putting through its paces at the Consumer Electronics Show (CES) in Las Vegas, is a neuromorphic chip—one that mimics, in a simplified way, the functioning of neurons and synapses in the brain.
The best AI algorithms already use brain-like programs called simulated neural networks, which rely on parallel processing to recognize patterns in data—including objects in images and words in speech. Neuromorphic chips take this idea further by etching the workings of neural networks into silicon. They are less flexible and powerful than the best general-purpose chips, but being specialized to their task makes them very energy efficient, and thus ideal for mobile devices, vehicles, and industrial equipment.
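The "simplified neurons" such chips implement are often spiking models like the leaky integrate-and-fire neuron: charge accumulates, leaks away over time, and a spike fires only when a threshold is crossed, which is why the hardware can stay idle (and frugal) most of the time. A minimal sketch of that behavior, with illustrative parameter values that are not Loihi's actual constants:

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) neuron, the kind of
# simplified brain-cell model spiking neuromorphic chips implement in
# silicon. Parameter values are illustrative, not taken from Loihi.

def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Integrate input each time step, leak stored charge, and emit a
    spike (1) whenever the membrane potential crosses the threshold."""
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = potential * leak + current  # integrate with leak
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                     # reset after spiking
        else:
            spikes.append(0)
    return spikes

# A steady weak input accumulates until the neuron periodically fires.
print(simulate_lif([0.3] * 10))  # → [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Because a neuron only draws attention to itself when it spikes, energy use scales with activity rather than with clock cycles, which is the efficiency argument behind this class of hardware.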
The idea of neuromorphic chips has been around for decades, but the technology may finally be ready to find its commercial niche. Across the tech industry, progress in AI has inspired new research into hardware capable of using machine-learning algorithms more efficiently.
Chris Eliasmith, a professor who studies neuroscience and computer architectures at the University of Waterloo in Canada, says the biggest challenge with neuromorphic chips in the past has been scaling them up. “This is one thing I really like about Intel entering the space,” he says. “They have the resources to push things ahead quickly.”
The chip is part of Intel’s attempt to reinvent itself. The company can no longer bank on delivering ever-faster processors, as Moore’s Law is bumping up against the laws of physics (see “Moore’s Law Is Dead. Now What?”). Meanwhile, the publicity for the new device offers some respite from the fallout caused by a recently revealed security flaw affecting hundreds of millions of Intel chips.
The company also announced at CES that it has built a relatively large new quantum computing chip, a device that exploits the weird and wonderful rules of quantum physics to do certain types of computation with incredible speed. That chip, called Tangle Lake, contains 49 quantum bits, or “qubits.”
Intel hasn’t yet revealed other details, including how reliable the quantum chip is. The news puts the company on a par with IBM, which recently unveiled a 50-qubit chip. However, while the devices are approaching a stage where they are capable of performing useful work, it isn’t yet clear how they might be used beyond a few niches such as cracking codes and modeling materials.