Hebb’s Rule Revisited
The postulate “fire together, wire together” suggests a general mechanism of synaptic plasticity, which is thought to underlie network wiring and the formation of cell assemblies. In a new study published in PLOS Computational Biology, three researchers at the Bernstein Center Freiburg shed new light on this process, which is fundamental for robust brain function.
Associative memory. Illustration by René Descartes (1596–1650), L'Homme, published 1677.
It is a success story of theory-experiment interaction in neuroscience. The idea behind it is discussed in every textbook that covers the cellular mechanisms of learning in the brain:
“When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”
Condensed into a one-liner: “Neurons wire together if they fire together”. This clear-sighted depiction of what may happen in the brain during learning was formulated by Donald O. Hebb in his book “The Organization of Behavior”, published in 1949. To the present day, a great number of experimental reports have supported this general principle. But how exactly do nerve cells do it? How can a neuron know that its fellow neurons want to associate with it? Which signals are transmitted, and what triggers the “growth process or metabolic change” Hebb speculated about?
A mechanistic description that clearly delineates cause and effect can only be inferred from experiments in biological neurons. But theory can help design such experiments. The prevailing interpretation of Hebbian learning assumes that correlation of neuronal activity is the key. This is indeed one possible way to implement Hebb’s rule: correlation induces growth. And it is how most scientists would conceive of associative learning algorithms. But in a biological network, the correlation of each and every pair of neurons would need to be measured and represented somewhere. Computer simulations of this setting are straightforward, but it is surprisingly difficult to obtain networks that remain stable while they learn. Something is missing.
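To make the stability problem concrete, here is a minimal sketch, in Python, of a purely correlation-driven weight update for a single synapse between two rate neurons. The learning rate, the activity model and all numbers are illustrative assumptions, not taken from the study:

```python
import numpy as np

# Toy correlation-based Hebbian rule for one synapse: the weight grows in
# proportion to the product of pre- and postsynaptic activity
# ("correlation induces growth"). All values are illustrative assumptions.
rng = np.random.default_rng(0)
eta = 0.01                       # learning rate (assumed)
w = 0.1                          # initial synaptic weight (assumed)

for step in range(100):
    x_pre = rng.poisson(5.0)     # presynaptic activity in one time bin
    x_post = w * x_pre           # postsynaptic activity driven by that input
    w += eta * x_pre * x_post    # Hebbian update: correlated firing -> growth

# without a separate stabilizing mechanism, the weight grows without bound
print(f"weight after 100 steps: {w:.2e}")
```

Because the postsynaptic activity itself scales with the weight, the update feeds back on itself and the weight diverges. Purely correlation-based rules therefore need an extra normalizing or homeostatic ingredient to keep a learning network stable.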
In their new paper, Júlia Gallinaro, Nebojša Gašparović and Stefan Rotter from the Bernstein Center Freiburg describe and analyze a scenario that corresponds to a less intricate and more robust implementation of Hebb’s rule. They assume that each neuron takes care of itself by functioning like a thermostat, with activity in place of temperature: if the neuron’s activity is too high, the number of its incoming and outgoing excitatory synapses is reduced; if the activity is too low, new excitatory synapses are grown. Which contacts are deleted, and where new ones are linked into the network, is decided at random. This ensures that neurons which fire together eventually also wire together, without any neuron having to monitor correlations explicitly. In computer simulations, this rule based on homeostatic structural plasticity works extremely well. A number of learning paradigms well known in psychology, such as classical conditioning and associative memory, can easily be implemented in terms of this rule. Experimenters may therefore consider including this possibility in their search for the mechanisms underlying Hebbian plasticity.
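The logic of the rule can be sketched in a few lines of Python. The following toy network is a sketch under stated assumptions, not the model analyzed in the paper: it uses linear rate neurons, rewires only incoming synapses for brevity (the rule described above adjusts both incoming and outgoing contacts), and all parameters are invented for the example:

```python
import numpy as np

# Toy homeostatic structural plasticity: each neuron compares its own
# activity to a set point and grows or deletes excitatory synapses
# accordingly; rewiring partners are chosen at random.
rng = np.random.default_rng(1)
n = 50                                  # number of excitatory neurons (assumed)
target = 5.0                            # activity set point, the "thermostat" setting
w_syn = 0.05                            # input contributed by one synapse (assumed)
ext_input = rng.uniform(3.0, 5.0, n)    # heterogeneous external drive (assumed)
W = np.zeros((n, n))                    # W[i, j]: number of synapses from j onto i
rate = ext_input.copy()

for step in range(2000):
    # relax the firing rates toward a linear fixed point (one Euler step)
    rate += 0.1 * (ext_input + w_syn * (W @ rate) - rate)
    rate = np.clip(rate, 0.0, None)

    for i in range(n):
        if rate[i] > target + 0.1 and W[i].sum() > 0:
            # activity too high: delete one incoming synapse, chosen at random
            j = rng.choice(n, p=W[i] / W[i].sum())
            W[i, j] -= 1
        elif rate[i] < target - 0.1:
            # activity too low: grow a synapse from a random partner
            W[i, rng.integers(n)] += 1

print(f"mean rate {rate.mean():.2f} (target {target}), synapses {int(W.sum())}")
```

Note that each neuron tracks only its own activity; no pairwise correlation is computed anywhere. The associative, “wiring together” effect emerges at the network level when groups of neurons are stimulated together, as analyzed in the paper.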