Don’t throw out your CPUs just yet, but there may be a new way to run your neural networks.
In the conventional world of computing, whether you're running exotic deep-learning algorithms or just using Excel, calculations are performed on a processor while data is shuttled back and forth to memory. That works perfectly well, but some researchers have argued that performing the calculations in memory itself would save the time and energy otherwise spent moving data around.
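To get a feel for the idea, here's a minimal sketch, in Python, of how computing inside a memory array can work: if a grid of memory cells stores a matrix of weights as electrical conductances, then applying voltages to the rows makes the physics of the array produce a matrix-vector product on its own, with no weight ever traveling to a processor. The function names and values here are illustrative assumptions, not IBM's actual hardware interface.

```python
import numpy as np

# Toy model of in-memory computing: weights live in a memory crossbar as
# conductances (siemens), and a matrix-vector product happens "in place"
# via Ohm's and Kirchhoff's laws instead of shuttling data to a CPU.

def crossbar_matvec(conductances: np.ndarray, voltages: np.ndarray) -> np.ndarray:
    """Each column's output current is the sum of voltage * conductance
    down that column (Kirchhoff's current law) -- exactly a dot product,
    computed by the memory array itself."""
    return voltages @ conductances  # currents I_j = sum_i V_i * G_ij

rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1e-3, size=(4, 3))  # 4x3 grid of memory-cell conductances
V = rng.uniform(0.0, 0.2, size=4)        # read voltages applied to the rows
print(crossbar_matvec(G, V))             # output currents, one per column
```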
And that’s exactly the concept that a team from IBM Research in Zurich has now applied to some AI algorithms. The team used a grid of one million memory devices, pictured above, all based on a phase-change material called germanium antimony telluride. The alloy’s special trick is that an electrical pulse can switch its state from amorphous, like glass, to crystalline, like metal, or back again.
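For a rough sense of why that switching is useful for computation rather than just storage, here's a toy model of a single cell. It assumes, purely for illustration, a linear relationship between how crystallized the material is and how well it conducts; real devices behave nonlinearly, and every constant below is invented.

```python
# Toy model (not IBM's device physics) of how a phase-change cell can hold
# more than an on/off bit: each electrical pulse partially crystallizes the
# germanium antimony telluride, and conductance rises with the crystalline
# fraction, so one device can store an analog value.

G_AMORPHOUS   = 1e-6   # low conductance: glass-like state (assumed value)
G_CRYSTALLINE = 1e-3   # high conductance: metal-like state (assumed value)

def conductance(crystal_fraction: float) -> float:
    """Interpolate between the two phases -- a simplification of the
    real, nonlinear device response."""
    return G_AMORPHOUS + crystal_fraction * (G_CRYSTALLINE - G_AMORPHOUS)

# Repeated pulses nudge the cell through intermediate states.
fraction = 0.0
for pulse in range(5):
    fraction = min(1.0, fraction + 0.2)  # each pulse crystallizes a bit more
    print(f"after pulse {pulse + 1}: G = {conductance(fraction):.2e} S")
```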