Summary: Researchers have developed a method to measure synaptic strength, plasticity precision, and information storage in the brain. Using information theory, they discovered that synapses can store 10 times more information than previously thought.
The results improve understanding of learning, memory, and how these processes evolve or deteriorate. This breakthrough could propel research into neurodevelopmental and neurodegenerative disorders.
Highlights:
- Synaptic plasticity: The study measures synaptic strength, plasticity and information storage using information theory.
- Increased storage: Results show that synapses can store 10 times more information than previously thought.
- Research impact: This method can advance studies of learning, memory and brain disorders such as Alzheimer’s disease.
Source: Salk Institute
Each time you flip through a set of vocabulary flashcards, the definitions come to mind faster and more easily. This process of learning and remembering new information strengthens important connections in your brain.
Remembering these new words and definitions more easily with practice is evidence that these neural connections, called synapses, can become stronger or weaker over time – a characteristic known as synaptic plasticity.
Quantifying the dynamics of individual synapses can be a challenge for neuroscientists, but recent computational innovations from the Salk Institute could change that and reveal new insights about the brain along the way.
To understand how the brain learns and retains information, scientists try to quantify how much stronger a synapse has become through learning and how much stronger it can become.
Synaptic strength can be measured by looking at the physical characteristics of synapses, but it is much more difficult to measure the precision of plasticity (whether synapses weaken or strengthen by a consistent amount) and the amount of information a synapse can store.
Salk scientists have established a new method to explore synaptic strength, plasticity precision and the amount of information storage. Quantifying these three synaptic features can improve scientific understanding of how humans learn and remember, as well as how these processes change over time or deteriorate with age or disease.
The results were published in Neural Computation on April 23, 2024.
“We are getting better at identifying exactly where and how individual neurons are connected to each other, but we still have a lot to learn about the dynamics of these connections,” says Professor Terrence Sejnowski, senior author of the study and holder of the Francis Crick Chair at Salk.
“We have now created a technique to study the strength of synapses, the precision with which neurons modulate that strength, and the amount of information synapses are capable of storing, leading us to discover that our brains can store 10 times more information than we previously thought.”
As a message travels through the brain, it jumps from neuron to neuron, flowing from the tip of one neuron into the outstretched tendrils, called dendrites, of another.
Each dendrite of a neuron is covered in tiny bulbous appendages, called dendritic spines, and at the end of each dendritic spine is the synapse, a small space where the two cells meet and an electrochemical signal is transmitted. Different synapses are activated to send different messages.
Some messages activate pairs of synapses, which live next to each other on the same dendrite. These pairs of synapses provide a fantastic research tool: if two synapses have identical activation histories, scientists can compare the strength of these synapses to draw conclusions about the precision of plasticity.
Since the same type and amount of information passed through these two synapses, did they each change in strength by the same amount? If so, their plasticity precision is high.
The Salk team applied information theory concepts to analyze pairs of synapses in a rat hippocampus (a part of the brain involved in learning and memory) to determine synaptic strength and the precision of plasticity.
Information theory is a sophisticated mathematical way of understanding information processing as input passing through a noisy channel and being reconstructed at the other end.
Importantly, unlike methods used in the past, information theory takes into account the noise of the brain’s many signals and cells, in addition to offering a discrete unit of information (the bit) for measuring the amount of information stored at a synapse.
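As a minimal illustration of that last point (this is not the study’s own code, and the category counts below are hypothetical), Shannon entropy converts a distribution of synapses over discrete strength categories into a number of bits per synapse:

```python
import numpy as np

# Hypothetical counts of synapses falling into each discrete strength category.
counts = np.array([12, 9, 15, 7, 11, 8, 10, 14])
p = counts / counts.sum()  # empirical probability of each category

# Shannon entropy: the average information, in bits, conveyed by the strength
# of a single synapse drawn from this distribution.
entropy_bits = -np.sum(p * np.log2(p))
print(f"{entropy_bits:.2f} bits per synapse "
      f"(maximum for {len(p)} categories: {np.log2(len(p)):.2f} bits)")
```

The more evenly the categories are used, the closer the entropy climbs to its ceiling of log2(number of categories), which is why counting distinguishable strength levels translates directly into bits of storage.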
“We divided the synapses by strength, of which there were 24 possible categories, and then compared these special pairs of synapses to determine how precisely the strength of each synapse is modulated,” explains Mohammad Samavat, first author of the study and a postdoctoral researcher in Sejnowski’s lab.
“We were excited to find that the pairs had very similar dendritic spine sizes and synaptic strengths, meaning the brain is very precise when it weakens or strengthens synapses over time.”
In addition to noting the similarities in the strength of synapses within these pairs, which translates into a high level of plasticity precision, the team also measured the amount of information contained in each of the 24 strength categories. Despite differences in the size of each dendritic spine, each of the 24 synaptic strength categories contained a similar amount (between 4.1 and 4.6 bits) of information.
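As a back-of-the-envelope check on those figures (an illustrative calculation, not the authors’ derivation), 24 distinguishable strength categories cap the information a single synapse can hold: entropy is maximized when every category is equally likely, giving

\[
H_{\max} = \log_2 24 \approx 4.58\ \text{bits},
\]

which matches the upper end of the reported range; using the categories somewhat unevenly yields fewer bits, consistent with the lower figure of about 4.1 bits.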
Compared to older techniques, this new approach using information theory is (1) more in-depth, accounting for 10 times more information storage in the brain than previously thought, and (2) scalable, meaning it can be applied to diverse and large datasets to gather information about other synapses.
“This technique is going to be a huge help to neuroscientists,” says Kristen Harris, a professor at the University of Texas at Austin and co-author of the study.
“Having this detailed look at synaptic strength and plasticity could really propel research into learning and memory, and we can use it to explore these processes in all the different parts of the human brain, the animal brain, the young brain, and the old brain.”
Sejnowski says future work from projects such as the National Institutes of Health’s BRAIN Initiative, which established an atlas of human brain cells in October 2023, will benefit from this new tool.
In addition to scientists who catalog the types and behaviors of brain cells, this technique is of interest to those who study problems with information storage, such as in the case of Alzheimer’s disease.
In the years to come, researchers around the world could use this technique to make exciting discoveries about the human brain’s ability to learn new skills, remember everyday actions, and store short- and long-term information.
About this latest research on synaptic plasticity
Author: Terrence Sejnowski
Source: Salk Institute
Contact: Terrence Sejnowski – Salk Institute
Image: The image is credited to Neuroscience News
Original research: Free access.
“Synaptic information storage capacity measured with information theory” by Terrence Sejnowski et al. Neural Computation
Abstract
Synaptic information storage capacity measured with information theory
Variation in synapse strength can be quantified by measuring the anatomical properties of synapses. Quantifying the precision of synaptic plasticity is fundamental to understanding information storage and retrieval in neural circuits.
Synapses from the same axon on the same dendrite have a common history of coactivation, making them ideal candidates for determining the precision of synaptic plasticity based on the similarity of their physical dimensions.
Here, the precision and amount of information stored in synapse dimensions were quantified with Shannon information theory, expanding on a previous analysis that used signal detection theory (Bartol et al., 2015).
The two methods were compared using dendritic spine head volumes in the middle of the stratum radiatum of hippocampal area CA1 as a well-defined measure of synaptic strength.
Information theory delineated the number of distinct synaptic strengths based on non-overlapping clusters of dendritic spine head volumes. Shannon entropy was applied to measure synaptic information storage capacity (SISC) and yielded a lower bound of 4.1 bits and an upper bound of 4.59 bits of information based on 24 distinct sizes.
We further compared the distribution of distinguishable sizes with a uniform distribution using the Kullback-Leibler divergence and found a nearly uniform distribution of spine head volumes across the sizes, suggesting optimal use of the distinguishable values.
Thus, SISC provides a new analytical measure that can be generalized to probe synaptic strengths and plasticity capacity in different brain regions of different species and among animals raised under different conditions or during learning. How brain diseases and disorders affect the precision of synaptic plasticity can also be studied.
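The Kullback-Leibler comparison mentioned in the abstract can be sketched in a few lines (this is not the authors’ code, and the cluster counts below are hypothetical): it measures, in bits, how far the observed distribution of spine head volumes across the distinguishable size clusters is from a perfectly uniform one, where a value near zero indicates that every distinguishable strength level is used about equally often.

```python
import numpy as np

# Hypothetical counts of spine head volumes in each of 24 distinguishable
# size clusters; the study derived its clusters from rat CA1 reconstructions.
counts = np.array([10, 12, 9, 11, 10, 13, 8, 11, 12, 9, 10, 11,
                   12, 10, 9, 13, 11, 10, 12, 9, 11, 10, 12, 11], dtype=float)
p = counts / counts.sum()           # observed cluster probabilities
q = np.full_like(p, 1.0 / len(p))   # uniform reference distribution

# Kullback-Leibler divergence D(p || q) in bits; 0 means perfectly uniform use
# of the distinguishable sizes.
kl_bits = np.sum(p * np.log2(p / q))
print(f"KL divergence from uniform: {kl_bits:.3f} bits")
```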