Making AI more energy efficient with neuromorphic computing

Twenty-five years of pioneering work in neuromorphic computing at CWI is now bearing fruit. Thanks to algorithmic breakthroughs in training spiking neural networks, many AI applications can become much more energy-efficient.

Publication date
5 Mar 2024

As artificial intelligence (AI) grows exponentially, so does its carbon footprint. Since the deep learning revolution of 2012, training the largest deep neural networks has become three hundred thousand times more computationally intensive. Training a single large AI model like ChatGPT produces a carbon footprint equal to the annual footprint of sixty people in the Western world.

Sander Bohté interviewed on his research by Nieuwsuur.

One way to reduce AI’s energy consumption and carbon footprint is to replace the traditional Von Neumann computing architecture, which separates memory and processing units, with a new type of architecture called neuromorphic computing. Neuromorphic computing mimics the architecture and behavior of the human brain: it integrates memory and processing units and is much better suited to parallel processing.

CWI senior researcher Sander Bohté began working on neuromorphic computing as a PhD student back in 1998, when the subject was barely on the map. In recent years, Bohté and his CWI colleagues have realized a number of algorithmic breakthroughs in spiking neural networks (SNNs) that finally make neuromorphic computing practical: in theory, many AI applications can become a factor of a hundred to a thousand more energy-efficient. This means that it will be possible to put much more AI into chips, allowing applications such as speech recognition, gesture recognition and the classification of electrocardiograms (ECGs) to run on a smartwatch or a smartphone.

“I am really grateful that CWI, and former group leader Han La Poutré in particular, gave me the opportunity to follow my interest, even though at the end of the 1990s neural networks and neuromorphic computing were quite unpopular”, says Bohté. “It was high-risk work for the long haul that is now bearing fruit.”

High energy efficiency

Spiking neural networks (SNNs) more closely resemble the biology of the brain: they process discrete pulses, or spikes, instead of the continuous signals used in classical neural networks. Unfortunately, that also makes them mathematically much harder to handle, and for many years SNNs were limited to small numbers of neurons. Thanks to clever algorithmic solutions, however, Bohté and his colleagues have managed to scale up the number of trainable spiking neurons, first to thousands in 2021, and then to tens of millions in 2023.
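To illustrate what "processing pulses" means, here is a minimal sketch of a leaky integrate-and-fire neuron, a common textbook building block of SNNs. This is an illustration only, not CWI's actual model or training code, and the parameter values are arbitrary assumptions:

```python
# Minimal discrete-time leaky integrate-and-fire (LIF) neuron.
# Illustrative sketch; parameters are arbitrary, not CWI's.

def lif_neuron(inputs, decay=0.9, threshold=1.0):
    """Turn a sequence of input currents into a binary spike train."""
    v = 0.0            # membrane potential
    spikes = []
    for current in inputs:
        v = decay * v + current    # leak a little, then integrate input
        if v >= threshold:         # threshold crossed: emit a spike
            spikes.append(1)
            v = 0.0                # reset the potential after spiking
        else:
            spikes.append(0)       # stay silent this time step
    return spikes

# A constant small input yields sparse, periodic spikes:
print(lif_neuron([0.3] * 10))  # → [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

The output is mostly zeros: the neuron only "costs" energy when it spikes, which is the intuition behind the energy savings the article describes. The hard part, and the subject of the breakthroughs mentioned above, is training networks of such neurons, since the all-or-nothing spike makes the usual gradient-based methods difficult to apply.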

To really achieve high energy efficiency, SNNs need special, so-called neuromorphic chips to run on. “Based on our algorithms, our Belgian research partner IMEC created a special neuromorphic chip, called μBrain”, says Bohté. “When our algorithm is run on this special chip, a factor of twenty in energy consumption is gained. That’s less than the theoretical gain, yet still very significant. For detecting heart defects, this means that you can implant an ECG-recording chip and it will run for a year on a single battery.”

Two major challenges

Thanks to decades of pioneering work and CWI’s national role, a community of SNN researchers has grown in the Netherlands in recent years. “We can now organize the annual Neuromorphic Computing Netherlands workshop with fifty to a hundred researchers,” says Bohté.

For the next five to ten years, Bohté sees two major challenges in his research. “The first is to further scale up SNNs by a factor of ten to a hundred so that we can also handle, for example, the smaller versions of Large Language Models, such as ChatGPT. That’s not trivial, and for that, we need new ideas.”

The second challenge is to enable neuromorphic chips to keep learning continuously, based on new data the chip receives. “That’s an unsolved problem, not just for SNNs but for AI in general”, says Bohté, “and here again we hope to draw new inspiration from biology.”

Author: Bennie Mols