Like a computer, the brain has a storage capacity that determines how much memory it can hold. However, while a computer's storage can be neatly measured in bits, the amount of information the brain can hold depends on the strength of the electrical signals passed between neurons. Understanding how these signals are mediated is, therefore, essential if we are to discover the brain's overall memory capacity.
Although a precise measurement of this capacity has never been agreed upon, a new study in the journal eLife suggests that previous rough estimates may have significantly underestimated the brain's memory storage capabilities. In fact, the authors suggest our memory capacity may be as much as ten times greater than we thought.
Neuronal cells communicate with one another via appendages called axons and dendrites, with the former carrying electrical signals away from the cell body and the latter conducting these impulses towards the cell body. An axon of one neuron meets with a dendrite of another at a junction called a synapse, the size of which determines the strength of the signals that can be transmitted. The number of different signal strengths that can be generated at each synapse dictates how much information can be processed, which means the brain’s overall memory capacity depends on the number of synapses it contains and the number of possible synaptic strengths.
Until recently, it had been presumed that synapses came in a relatively small number of different sizes. However, the study authors discovered something that challenged this assumption: they noticed that some axons formed two synapses with a single dendrite, and that these synapses differed in size by a very small amount – typically around eight percent.
Synaptic strength is much more variable than previously thought, according to a new study. vitstudio/Shutterstock
This suggests that synaptic size can vary in very fine increments, and that signal strengths can therefore be very precisely controlled. Using a technique called serial section electron microscopy to create a 3D reconstruction of a portion of brain tissue, the team identified 26 distinguishable synaptic strengths, which means each synapse should be able to store about 4.7 bits of information. This is roughly ten times more than had previously been thought.
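For readers curious where the 4.7 figure comes from: a system that can sit in any one of 26 distinguishable states carries log2(26) bits of information. A quick back-of-the-envelope check (not taken from the study itself) confirms the number:

```python
import math

# 26 distinguishable synaptic strengths -> log2(26) bits per synapse
num_strengths = 26
bits_per_synapse = math.log2(num_strengths)

print(round(bits_per_synapse, 1))  # 4.7
```

By comparison, if synapses came in only a handful of sizes – say, four – each could store just log2(4) = 2 bits, which is why the earlier, coarser picture implied a much smaller capacity.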
To regulate this strength, synapses change size in response to the intensity of the information being delivered along the axon – a process known as synaptic plasticity. This occurs via an influx of calcium into the dendritic spine, which increases the surface area of the synapse and allows more signal to be transmitted across the gap.
Since the brain contains several trillion synapses, the researchers believe their calculations point to a truly staggering storage capacity. Study coauthor Terry Sejnowski explained in a statement that "our new measurements of the brain's memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web," adding that we might be able to use this information to improve computer design.
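As a rough sanity check on the petabyte figure (this arithmetic is illustrative, not from the study): one petabyte is 10^15 bytes, or 8 × 10^15 bits, so at 4.7 bits per synapse we can ask how many synapses would be needed to hold that much:

```python
import math

bits_per_synapse = math.log2(26)   # ~4.7 bits, from 26 distinguishable strengths
petabyte_in_bits = 8e15            # 1 PB = 10^15 bytes = 8 x 10^15 bits

# Synapses required to store one petabyte at this per-synapse capacity
synapses_needed = petabyte_in_bits / bits_per_synapse

print(f"{synapses_needed:.1e}")    # roughly 1.7e15, i.e. on the order of 10^15
```

That works out to roughly a quadrillion synapses, which is within the range of commonly cited estimates for the human brain.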