A model for information storage in the brain reveals how memories decay with age.
Theoretical constructs called attractor networks provide a model for memory in the brain. A new study of such networks traces the route by which memories are stored and ultimately forgotten [1]. The mathematical model and simulations show that, as they age, memories recorded in patterns of neural activity become chaotic—impossible to predict—before disintegrating into random noise. Whether this behavior occurs in real brains remains to be seen, but the researchers propose looking for it by monitoring how neural activity changes over time in memory-retrieval tasks.
Memories in both artificial and biological neural networks are stored and retrieved as patterns in the way signals pass among the many nodes (neurons) of a network. In an artificial neural network, each node’s output at any moment is determined by the inputs it receives from the other nodes to which it is connected. Analogously, the likelihood of a biological neuron “firing” (emitting an electrical pulse), as well as its firing rate, depends on its inputs. In a further analogy with neurons, the links between nodes, which represent synapses, carry “weights” that can amplify or attenuate the signals they transmit. The weight of a given link is determined by how synchronized the activity of the two nodes it connects has been, and it may be altered as new memories are stored.
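The storage-and-retrieval scheme described above can be illustrated with a classic Hopfield-style attractor network, a standard textbook example of this class of models (a sketch for illustration only; the study’s specific model may differ). Weights are set by a Hebbian rule, so a link’s weight reflects how correlated the two nodes’ activities are across the stored patterns, and each node’s output is determined by the weighted inputs it receives:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                                        # number of nodes ("neurons")
patterns = rng.choice([-1, 1], size=(3, N))    # three stored memory patterns

# Hebbian rule: a link's weight reflects how synchronized (correlated)
# the two nodes it connects are across the stored patterns.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)                         # no self-connections

def recall(state, steps=20):
    """Each node's new output is set by the weighted sum of its inputs."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1                  # break ties consistently
    return state

# Retrieval: start from a corrupted copy of one memory and let the
# dynamics settle into the nearest attractor (the stored pattern).
cue = patterns[0].copy()
flipped = rng.choice(N, size=15, replace=False)
cue[flipped] *= -1                             # corrupt 15 of 100 entries
recovered = recall(cue)
overlap = (recovered @ patterns[0]) / N        # 1.0 means perfect recall
```

With only a few stored patterns, the corrupted cue falls back into the attractor of the original memory; the model’s “forgetting” appears as this retrieval degrading when too many memories are stored or, in the new study’s aging dynamics, as the pattern drifting toward chaos and noise.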