The brain stores 10x more information than expected – Neuroscience News

Summary: Researchers have developed a method to measure synaptic strength, the precision of plasticity, and information storage in the brain. Using information theory, they discovered that synapses can store ten times more information than previously thought.

The findings advance the understanding of learning, memory and how these processes evolve or deteriorate. This breakthrough could stimulate research into neurodevelopmental and neurodegenerative disorders.

Key Facts:

  • Synaptic plasticity: The study measures synaptic strength, precision of plasticity, and information storage using information theory.
  • Increased storage: Findings show that synapses can store 10 times more information than previously thought.
  • Research impact: This method can advance research into learning, memory and brain disorders such as Alzheimer’s disease.

Source: Salk Institute

Each time you flip through a deck of vocabulary flashcards, their definitions come to mind faster and more easily. This process of learning and remembering new information strengthens important connections in your brain.

Being able to recall these new words and definitions more easily with practice is evidence that these neural connections, called synapses, can become stronger or weaker over time – a characteristic known as synaptic plasticity.

Quantifying the dynamics of individual synapses can be a challenge for neuroscientists, but recent computational innovations from the Salk Institute could change that, revealing new insights about the brain along the way.

To understand how the brain learns and retains information, scientists try to quantify how much stronger a synapse has become through learning, and how much stronger it can become.

Synaptic strength can be measured by looking at the physical characteristics of synapses, but it is much more difficult to measure the precision of plasticity (whether synapses consistently weaken or strengthen by the same amount) and the amount of information a synapse can store.

Salk scientists have developed a new method to investigate synaptic strength, precision of plasticity and amount of information storage. Quantifying these three synaptic features could improve scientific understanding of how people learn and remember, as well as how these processes evolve over time or deteriorate with age or disease.

The findings were published in Neural Computation on April 23, 2024.

“We are getting better at identifying exactly where and how individual neurons are connected, but we still have a lot to learn about the dynamics of those connections,” said Professor Terrence Sejnowski, senior author of the study and holder of the Francis Crick Chair at Salk.

“We have now developed a technique to study the strength of synapses, the precision with which neurons modulate that strength, and the amount of information that synapses can store – leading us to discover that our brains can store ten times more information than previously thought.”

When a message travels through the brain, it jumps from neuron to neuron, flowing from the end of one neuron to the extended tendrils, called dendrites, of another neuron.

Each dendrite on a neuron is covered with small bulbous appendages called dendritic spines, and at the end of each dendritic spine is the synapse: a small space where the two cells meet and an electrochemical signal is sent. Different synapses are activated to send different messages.

Some messages activate pairs of synapses, which live close together on the same dendrite. These synapse pairs are a fantastic research tool: if two synapses have identical activation histories, scientists can compare the strength of those synapses to draw conclusions about the precision of plasticity.

Since the same kind and amount of information has passed through these two synapses, has each changed in strength to the same extent? If so, the precision of their plasticity is high.
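To make that pair comparison concrete, here is a minimal illustrative sketch in Python, using hypothetical spine volumes rather than the study's data: it takes head volumes for co-activated spine pairs and asks how similar the two members of each pair are.

```python
import numpy as np

# Hypothetical head volumes (cubic micrometers) for dendritic spine pairs
# that share an activation history: each row is (spine_1, spine_2) on the
# same dendrite, driven by the same axon.
pairs = np.array([
    [0.021, 0.023],
    [0.110, 0.105],
    [0.450, 0.470],
    [0.062, 0.060],
])

# One simple similarity measure: the coefficient of variation (CV) within
# each pair. A low CV means the two co-activated synapses ended up at
# nearly the same strength, i.e. plasticity was precise.
cv = pairs.std(axis=1, ddof=1) / pairs.mean(axis=1)
print("per-pair CV:", np.round(cv, 3))
print("median CV:  ", round(float(np.median(cv)), 3))
```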

The Salk team applied concepts from information theory to analyze synapse pairs from a rat’s hippocampus – a part of the brain involved in learning and memory – for synaptic strength, precision of plasticity, and information storage.

Information theory is an advanced mathematical way of understanding information processing as input passing through a noisy channel and being reconstructed at the other end.

Crucially, unlike methods used in the past, information theory accounts for the noise of the many signals and cells in the brain, and it provides a discrete unit of information (a bit) with which to measure the amount of information stored at a synapse.
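As a rough sketch of that noisy-channel picture (illustrative numbers, not the study's model): the more noise there is relative to the spacing between strength levels, the fewer levels survive as distinguishable after reconstruction at the other end.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy noisy channel: a "true" synaptic strength is observed with noise
# added. Whether 24 strength levels stay distinguishable depends on how
# large the noise is relative to the spacing between levels.
levels = np.linspace(0.0, 1.0, 24)    # 24 candidate strength levels
noise_sd = 0.01                       # assumed noise scale (illustrative)

sent = rng.choice(levels, size=10_000)
received = sent + rng.normal(0.0, noise_sd, size=sent.size)

# Reconstruct by snapping each received value to the nearest level.
decoded = levels[np.abs(received[:, None] - levels[None, :]).argmin(axis=1)]
print("fraction decoded correctly:", (decoded == sent).mean())
```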

“We divided the synapses by strength, of which there were 24 possible categories, and then compared synapse pairs to determine how precisely the strength of each synapse is modulated,” said Mohammad Samavat, first author of the study and a postdoctoral researcher in Sejnowski’s lab.

“We were pleased to find that the pairs had very similar dendritic spine sizes and synaptic strengths, meaning the brain is very precise about making synapses weaker or stronger over time.”

In addition to identifying the similarities in synapse strength within these pairs, which translates into a high degree of precision of plasticity, the team also measured the amount of information in each of the 24 strength categories. Despite differences in the size of each dendritic spine, each of the 24 synaptic strength categories contained a similar amount (between 4.1 and 4.6 bits) of information.
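Those numbers line up with textbook information theory: 24 distinguishable categories can carry at most log2(24) ≈ 4.58 bits, and the Shannon entropy of the actual category frequencies can only sit at or below that ceiling. A minimal sketch with made-up category counts (not the study's measurements):

```python
import numpy as np

# Hypothetical counts of spines falling into each of 24 strength
# categories. (Illustrative only; the real distribution comes from
# reconstructed hippocampal tissue.)
rng = np.random.default_rng(0)
counts = rng.integers(8, 14, size=24)   # roughly uniform occupancy
p = counts / counts.sum()

entropy = -np.sum(p * np.log2(p))       # Shannon entropy in bits
print(f"max possible: {np.log2(24):.2f} bits")  # ~4.58
print(f"observed:     {entropy:.2f} bits")      # near the max if near-uniform
```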

Compared with older techniques, this new information-theoretic approach is (1) more thorough, revealing ten times more information storage capacity in the brain than previously estimated, and (2) scalable, meaning it can be applied to large and varied datasets to gather information about other synapses.

“This technique will be a tremendous help to neuroscientists,” said Kristen Harris, a professor at the University of Texas at Austin and a co-author of the study.

“Having this detailed look at synaptic strength and plasticity could really boost research into learning and memory, and we can use it to explore these processes in all the different parts of the brain, in both human and animal brains, and in brains young and old.”

Sejnowski says future work from projects such as the National Institutes of Health’s BRAIN Initiative, which created an atlas of human brain cells in October 2023, will benefit from this new tool.

Beyond helping scientists catalog brain cell types and behaviors, the technique is also promising for those studying what happens when information storage goes wrong, as in Alzheimer’s disease.

In the coming years, researchers around the world could use this technique to make exciting discoveries about the human brain’s ability to learn new skills, remember everyday actions, and store information in the short and long term.

About this synaptic plasticity research news

Author: Terrence Sejnowski
Source: Salk Institute
Contact: Terrence Sejnowski – Salk Institute
Image: The image is credited to Neuroscience News

Original research: Open access.
“Synaptic Information Storage Capacity Measured with Information Theory” by Terrence Sejnowski et al. Neural Computation


Abstract

Synaptic information storage capacity measured with information theory

Variation in the strength of synapses can be quantified by measuring the anatomical properties of synapses. Quantifying the precision of synaptic plasticity is fundamental to understanding information storage and retrieval in neural circuits.

Synapses from the same axon on the same dendrite have a common history of coactivation, making them ideal candidates for determining the precision of synaptic plasticity based on the similarity of their physical sizes.

Here, the precision and amount of information stored in synapse dimensions were quantified using Shannon information theory, extending previous analyses using signal detection theory (Bartol et al., 2015).

The two methods were compared using dendritic spine head volumes in the middle of the stratum radiatum of hippocampal area CA1 as well-defined measures of synaptic strength.

Information theory delineated the number of distinguishable synaptic strengths based on non-overlapping bins of dendritic spine head volumes. Shannon entropy was applied to measure synaptic information storage capacity (SISC), yielding a lower bound of 4.1 bits and an upper bound of 4.59 bits of information based on 24 distinguishable sizes.

We further compared the distribution of distinguishable sizes and a uniform distribution using Kullback-Leibler divergence and found that there was a nearly uniform distribution of spine head volumes across the sizes, indicating optimal use of the distinguishable values.

Thus, SISC provides a new analytical measure that can be generalized to investigate synaptic strengths and capacity for plasticity in different brain regions from different species and in animals raised in different conditions or during learning. How brain diseases and disorders affect the precision of synaptic plasticity can also be explored.
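For readers who want the divergence step made concrete, here is a minimal sketch of the Kullback-Leibler comparison described in the abstract, using illustrative probabilities rather than the measured spine-head-volume distribution:

```python
import numpy as np

# Kullback-Leibler divergence of an observed category distribution p from
# the uniform distribution q over the same 24 bins. D_KL = 0 exactly when
# p is uniform; small values mean near-optimal use of distinguishable sizes.
def kl_divergence(p, q):
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                      # 0 * log(0/q) = 0 by convention
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

q = np.full(24, 1 / 24)               # uniform reference distribution
p = np.full(24, 1 / 24)               # illustrative near-uniform "data"
p[0] += 0.01                          # small deviation from uniformity
p[1] -= 0.01
print(f"D_KL(p || uniform) = {kl_divergence(p, q):.4f} bits")
```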
