
Oct 13, 2020

A New Brain-Inspired Learning Method for AI Saves Memory and Energy

Posted in categories: neuroscience, robotics/AI

That prompted the researchers, who are part of the Human Brain Project, to look at two features that have become clear in experimental neuroscience data: each neuron retains a memory of previous activity in the form of molecular markers that slowly fade with time, and the brain provides top-down learning signals, for example via the neurotransmitter dopamine, which modulates the behavior of groups of neurons.

In a paper in Nature Communications, the Austrian team describes how they created artificial analogues of these two features to build a new learning paradigm they call e-prop. While the approach learns more slowly than backpropagation-based methods, it achieves comparable performance.

More importantly, it allows online learning. Rather than processing big batches of data at once, which requires constant data transfer to and from memory and contributes significantly to machine learning's energy bills, the approach simply learns from each example as it arrives. That dramatically cuts the memory and energy required, making it far more practical for on-chip learning in smaller mobile devices.
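The idea can be illustrated with a minimal sketch. This is not the paper's exact algorithm (e-prop is defined for spiking recurrent networks); it is a toy, single-synapse version under assumed names and parameters, showing the two ingredients described above: an eligibility trace that fades over time, and a broadcast top-down learning signal that converts that trace into a weight change, one sample at a time with no stored batches and no backward pass.

```python
def eprop_step(weight, trace, pre, post, learning_signal,
               decay=0.9, lr=0.01):
    """One online update for a single synapse (illustrative only).

    trace           -- eligibility trace: a fading memory of recent
                       pre/post co-activity (the "molecular marker")
    learning_signal -- broadcast top-down signal (the dopamine-like
                       modulation) that gates the weight change
    """
    # The trace decays over time and accumulates co-activity.
    trace = decay * trace + pre * post
    # The top-down signal turns the trace into a weight update.
    weight = weight + lr * learning_signal * trace
    return weight, trace


# Online learning: stream one (pre, post, signal) sample at a time,
# keeping only the current weight and trace in memory.
w, e = 0.0, 0.0
for pre, post, signal in [(1, 1, 0.5), (0, 1, -0.2), (1, 0, 0.1)]:
    w, e = eprop_step(w, e, pre, post, signal)
```

Because each step touches only the current sample plus two scalars of state, the memory footprint stays constant regardless of how long the data stream runs, which is the property that makes this style of update attractive for on-chip learning.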
