The results of a new study offer fresh insight into how the brain learns.
In a paper published in the journal Scientific Reports, the authors describe how they reached this conclusion by studying computer models of neurons and cell cultures.
In the vast neural network of the brain, neurons behave like tiny microchips: they take in inputs through their dendrites and, when certain conditions are met, send outputs along their axons.
Axons, in turn, are connected to the dendrites of other neurons through links called synapses. There are many more synapses per neuron than dendrites.
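To make the “tiny microchip” picture concrete, here is a minimal threshold-neuron sketch in Python. It illustrates the general idea rather than the model used in the study, and the inputs, weights, and threshold are invented for the example.

```python
# Toy illustration (not the study's model): a neuron sums the inputs
# arriving on its dendrites and fires an output along its axon once
# a threshold is crossed. All numbers here are invented.

def neuron_output(dendritic_inputs, weights, threshold=1.0):
    """Return 1 (a spike sent down the axon) if the weighted sum of
    dendritic inputs reaches the threshold, else 0."""
    total = sum(x * w for x, w in zip(dendritic_inputs, weights))
    return 1 if total >= threshold else 0

# Example: three inputs arriving through synapses onto the dendrites.
print(neuron_output([0.5, 0.9, 0.2], [0.5, 0.8, 0.3]))  # -> 1
```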
A significant result of the new research is that, because it locates learning in the dendrites rather than in the synapses, it implies far fewer learning parameters per neuron than previously thought.
“In this new dendritic learning process,” notes senior study author Prof. Ido Kanter, of the Gonda Interdisciplinary Brain Research Center at Bar-Ilan University in Israel, “there are a few adaptive parameters per neuron, in comparison to thousands of tiny and sensitive ones in the synaptic learning scenario.”
Another important result of the new study is that learning happens much faster in the new dendritic model than in the traditional synaptic model.
The results may have important implications for treatments of brain disorders and the design of computer applications — such as “deep learning algorithms” and artificial intelligence — which are based on mimicking the way that the brain works.
The researchers anticipate that, in the case of the latter, their study opens the door to the design of more advanced features and much faster processing speeds.
The traditional, synaptic model of learning is rooted in pioneering work by Donald Hebb that was published in 1949 in the book The Organization of Behavior.
That model, which Prof. Kanter and his colleagues refer to as “learning by links,” proposes that the “learning parameters” that change during learning reflect the number of synapses, or links, per neuron; in this view, the synapses are the elementary computational units of the neural network.
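Hebbian learning is often summarized as “cells that fire together wire together”: each synaptic weight is its own learning parameter, nudged whenever the neurons it connects are active at the same time. Below is a minimal Python sketch of this “learning by links” update; the learning rate and activity values are invented for illustration.

```python
# A minimal sketch of Hebbian "learning by links": each synaptic
# weight is a separate learning parameter, strengthened whenever
# the pre- and postsynaptic neurons are active together.

LEARNING_RATE = 0.01  # illustrative value

def hebbian_update(weights, pre_activity, post_activity):
    """Strengthen each synapse in proportion to the joint activity
    of the neurons it connects (classic Hebbian rule)."""
    return [w + LEARNING_RATE * pre * post_activity
            for w, pre in zip(weights, pre_activity)]

weights = [0.2, 0.5, 0.1]  # one learning parameter per synapse (link)
weights = hebbian_update(weights, pre_activity=[1, 0, 1], post_activity=1)
print(weights)  # -> approximately [0.21, 0.5, 0.11]
```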
‘Learning by nodes’
In their new model — which they refer to as “learning by nodes” — the researchers propose that the learning parameters reflect not the number of synapses, of which there are many per neuron, but the number of dendrites, or nodes, of which there are only a few per neuron.
Therefore, they explain, “in a network of connecting neurons,” the number of learning parameters per neuron in the synaptic model is “significantly larger” than the number in the dendritic model.
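A back-of-the-envelope comparison makes that gap vivid. The counts below are illustrative orders of magnitude only, echoing the “thousands” of synapses versus “a few” dendrites per neuron described above; they are not figures from the paper.

```python
# Rough comparison of learning parameters per scenario.
# Both per-neuron counts are assumptions made for this sketch.

NEURONS = 1_000
SYNAPSES_PER_NEURON = 10_000   # assumed: "thousands" of links
DENDRITES_PER_NEURON = 5       # assumed: "a few" nodes

synaptic_params = NEURONS * SYNAPSES_PER_NEURON    # learning by links
dendritic_params = NEURONS * DENDRITES_PER_NEURON  # learning by nodes

print(f"synaptic model:  {synaptic_params:,} parameters")
print(f"dendritic model: {dendritic_params:,} parameters")
# synaptic model:  10,000,000 parameters
# dendritic model: 5,000 parameters
```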
The main purpose of their study was to compare the “cooperative dynamical properties between synaptic (link) and dendritic (nodal) learning scenarios.”
The study authors conclude that their results “strongly indicate that a faster and enhanced learning process occurs in the neuronal dendrites, similarly to what is currently attributed to the synapses.”
Weak synapses play key role in learning
Another significant finding of the study is that weak synapses, which make up the majority of the brain’s synapses and were previously thought to play an insignificant role in learning, are in fact very important.
The authors note that “the dynamics is counterintuitively mainly governed by the weak links.”
In the dendritic model, the weak synapses appear to cause the learning parameters to oscillate rather than settle at “unrealistic fixed extremes,” as they do in the synaptic model.
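As a purely qualitative illustration of that contrast, and emphatically not the authors’ equations, the toy updates below place a self-reinforcing parameter that drifts to a fixed extreme next to a mean-reverting one that keeps oscillating around a set point. The update rules, gain, and set point are all invented for the sketch.

```python
# Qualitative toy only: two invented update rules contrasting a
# parameter that saturates at an extreme with one that oscillates.

def saturating(w, rate=0.5, steps=50):
    """Positive feedback: the parameter grows until clipped at 1.0."""
    for _ in range(steps):
        w = min(1.0, w + rate * w)
    return w

def oscillating(w, target=0.5, gain=2.0, steps=50):
    """Mean reversion with overshoot: the parameter keeps bouncing
    around the set point instead of settling at an extreme."""
    trace = []
    for _ in range(steps):
        w = w + gain * (target - w)
        trace.append(round(w, 3))
    return trace[-4:]

print(saturating(0.1))   # -> 1.0 (stuck at a fixed extreme)
print(oscillating(0.1))  # -> [0.9, 0.1, 0.9, 0.1] (sustained oscillation)
```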
Prof. Kanter summarizes the results with an analogy about measuring air quality.
“Does it make sense,” he asks, “to measure the quality of air we breathe via many tiny, distant satellite sensors at the elevation of a skyscraper, or by using one or several sensors in close proximity to the nose?”