Analogue chips can slash the energy used to run AI models
AI research uses vast amounts of energy, but new research shows that analogue devices can run models far more efficiently due to their unusual ability to carry out data storage and processing in the same place
By Matthew Sparkes
23 August 2023
The analogue chip that could increase AI efficiency (Image: Ryan Lavine/IBM)
An analogue computer chip can run an artificial intelligence (AI) speech recognition model 14 times more efficiently than traditional chips, potentially offering a solution to the vast and growing energy use of AI research and to the worldwide shortage of the digital chips normally used for the job.
The device was developed by IBM Research, which declined New Scientist’s request for an interview and didn’t provide any comment. But in a paper outlining the work, researchers claim that the analogue chip can reduce bottlenecks in AI development.
There is a global rush for GPUs, the graphics processors originally designed to run video games that have traditionally been used to train and run AI models, with demand outstripping supply. Studies have also shown that the energy use of AI is growing rapidly, rising 100-fold from 2012 to 2021, with most of that energy derived from fossil fuels. These issues have led to suggestions that the ever-increasing scale of AI models will soon reach an impasse.
Another problem with current AI hardware is that it must shuttle data back and forth between memory and processors, an operation that causes significant bottlenecks. One solution is an analogue compute-in-memory (CiM) chip that performs calculations directly within its own memory, which IBM has now demonstrated at scale.
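As a rough illustration of why this matters (this is not IBM's design, and all names and sizes below are made up for the example), the sketch compares how much data a conventional chip must move per matrix-vector multiply, where every weight is fetched from separate memory before it can be used, with a compute-in-memory scheme, where only the inputs and outputs cross the chip boundary:

```python
import numpy as np

# Illustrative only: estimate data movement for one layer's matrix-vector multiply.
rng = np.random.default_rng(0)
weights = rng.standard_normal((512, 1024)).astype(np.float32)  # one layer's weights
activations = rng.standard_normal(1024).astype(np.float32)      # input signals

def conventional_mvm(w, x):
    # Von Neumann style: conceptually, every weight is read out of a separate
    # memory and shuttled to the processor before the multiply-accumulate runs.
    bytes_moved = w.nbytes
    return w @ x, bytes_moved

def in_memory_mvm(w, x):
    # Compute-in-memory style: the multiply-accumulate happens where the weights
    # are stored, so only inputs in and results out need to move.
    bytes_moved = x.nbytes + w.shape[0] * x.itemsize
    return w @ x, bytes_moved

_, moved_conventional = conventional_mvm(weights, activations)
_, moved_in_memory = in_memory_mvm(weights, activations)
print(f"conventional: {moved_conventional / 1e6:.1f} MB of weights moved per pass")
print(f"in-memory:    {moved_in_memory / 1e3:.1f} kB of inputs and outputs moved per pass")
```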
IBM’s device contains 35 million phase-change memory cells – a form of CiM – each of which can be set to one of two states, like a transistor in a conventional computer chip, but also to varying degrees in between.
This last trait is crucial because these intermediate states can be used to represent the synaptic weights between artificial neurons in a neural network, a type of AI modelled on the way links between neurons in human brains vary in strength as new information or skills are learned. Those weights are traditionally stored as digital values in computer memory. Holding them directly in the analogue cells lets the new chip store and process them without carrying out millions of operations to recall or store data in distant memory chips.
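A minimal sketch of the general principle behind such analogue arrays (not a description of IBM's chip) is shown below: weights are mapped onto a limited set of cell conductance levels, input activations are applied as voltages, and the multiply-accumulate falls out of basic circuit laws, with per-cell currents following Ohm's law and summing along each output line. The number of levels, the noise magnitude and the function names here are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def program_conductances(weights, levels=16):
    """Map weights onto a small number of discrete conductance levels,
    mimicking the in-between states an analogue memory cell can hold.
    (In real hardware, pairs of cells are typically used for signed weights.)"""
    w_max = np.abs(weights).max()
    step = 2 * w_max / (levels - 1)
    return np.round(weights / step) * step  # quantised analogue of the weights

def crossbar_mvm(conductances, voltages, read_noise=0.01):
    """Analogue matrix-vector multiply: Ohm's law gives per-cell currents
    (G * V), and the currents on each output line add up into dot products."""
    currents = conductances * voltages                               # I = G * V per cell
    currents += read_noise * rng.standard_normal(currents.shape)     # analogue imprecision
    return currents.sum(axis=1)                                      # summed current per output

weights = 0.5 * rng.standard_normal((4, 8))  # a tiny layer's synaptic weights
inputs = rng.standard_normal(8)

exact = weights @ inputs
approx = crossbar_mvm(program_conductances(weights), inputs)
print("digital result: ", np.round(exact, 3))
print("analogue result:", np.round(approx, 3))
```

Because the result emerges from physics rather than from fetching each weight digitally, the multiply-accumulate is approximate but requires no round trips to separate memory, which is the trade-off the article describes.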