For decades, scientists believed neurons were the brain’s sole architects of thought and memory — but now, new research suggests that another, often-overlooked type of brain cell may play a more central role in memory than previously thought.
The study, published in May in the journal PNAS, proposes that these other brain cells, called astrocytes, could be responsible for the brain’s impressive memory-storage capacity through a newly described kind of network architecture.
Astrocytes are star-shaped cells that perform many maintenance tasks in the brain, including clearing cellular debris, supplying neurons with nutrients and regulating blood flow. They also sport thin branching structures, known as processes, that wrap around synapses, the junctions where neurons exchange messages. This wrapping forms what is called a tripartite synapse, a kind of three-way handshake involving the two connected neurons and the astrocyte.
“You can imagine an astrocyte as an octopus with millions of tentacles,” said lead author Leo Kozachkov, who was a PhD student at MIT at the time the study was conducted and is now a postdoctoral fellow at IBM Research in Yorktown Heights, New York. “The head of the octopus is the cell body, and the tentacles are ‘processes’ that wrap around nearby synapses,” Kozachkov told Live Science in an email.
Astrocytes don’t transmit electrical impulses like neurons do. Instead, they communicate via calcium signaling, sending waves of charged calcium particles, called ions, within and between cells. Studies have shown that astrocytes respond to synaptic activity by altering their internal calcium levels. These changes can then trigger the release of chemical messengers from the astrocyte into the synapse.
“These processes act as tiny calcium computers, sensing when information is sent through the synapse, passing that information to other processes, and then receiving feedback in return,” Kozachkov said. Ultimately, this chain of messages makes its way back to the neurons, which adjust their activity in turn. However, researchers don’t yet fully understand the precise computations astrocytes perform on the information they receive from neurons.
To better understand this function, Kozachkov and his colleagues turned to machine learning architectures that can represent complex interactions among many units at once, rather than capturing only simple connections between pairs.
Traditional machine learning networks that link only pairs of neurons can store only a limited amount of information, said senior study author Dmitry Krotov, a research staff member at the MIT-IBM Watson AI Lab and IBM Research. Because a single astrocyte can connect to thousands of synapses, the team hypothesized that astrocytes might mediate communication across all of those connections at once. That, they proposed, could explain how the brain achieves its massive storage capacity.
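The key difference is that such an architecture allows many-way interactions instead of only pairwise ones. As a rough illustration, here is a minimal, self-contained Python sketch, not the study's actual model, contrasting a classical pairwise associative memory with a "dense" one whose recall step applies a steep nonlinearity to whole-pattern overlaps, a simple stand-in for interactions that span many synapses at once. All names and parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy comparison (not the study's actual model): store many random binary
# patterns, then try to recall one of them from a corrupted cue.
N = 64   # number of units ("neurons")
K = 200  # number of stored patterns, far more than a pairwise network of N units can hold
patterns = rng.choice([-1, 1], size=(K, N))

def sgn(x):
    """Map values to +/-1, treating exact zeros as +1 to keep states binary."""
    return np.where(x >= 0, 1, -1)

def recall_pairwise(cue, steps=20):
    """Classical Hopfield-style recall using only pairwise weights between units.
    Its capacity is limited to roughly 0.14 * N patterns."""
    W = patterns.T @ patterns / N   # pairwise weight matrix built from stored patterns
    np.fill_diagonal(W, 0)
    state = cue.copy()
    for _ in range(steps):
        state = sgn(W @ state)
    return state

def recall_dense(cue, steps=20, power=3):
    """Dense-associative-memory-style recall: the update applies a steep
    nonlinearity to each stored pattern's overlap with the current state,
    a stand-in for interactions that couple many synapses at once."""
    state = cue.copy()
    for _ in range(steps):
        overlaps = patterns @ state / N            # similarity to every stored pattern
        state = sgn(patterns.T @ overlaps**power)  # steeply favor the best matches
    return state

# Corrupt one stored pattern by flipping 10% of its entries, then try to recall it.
target = patterns[0]
cue = target.copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1

print("pairwise recall exact:", np.array_equal(recall_pairwise(cue), target))
print("dense recall exact:   ", np.array_equal(recall_dense(cue), target))
```

With 200 stored patterns and only 64 units, the pairwise network is well past its capacity limit, while the higher-order update can hold far more patterns than it has units; the cubic nonlinearity is just one convenient choice to make that point.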
“The unique anatomical structure of astrocytes provides a very natural and tempting way to design these large information storage systems in biological hardware,” Kozachkov told Live Science in an email.
The researchers also hypothesized that astrocytes store memories through gradual changes in their internal calcium patterns, which are later translated back into chemical signals sent to neurons. In this model, each astrocyte process, rather than the whole cell, functions as a distinct computational unit.
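To picture what "each process as its own computational unit" could look like, here is a deliberately simplified, hypothetical sketch: every process keeps its own slowly changing calcium variable that integrates activity at the single synapse it wraps and sends a saturating signal back. The leaky-integrator equation, the time constants and the tanh feedback are placeholder assumptions for illustration, not equations from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

P = 8                  # processes on one astrocyte (illustrative; real cells have far more)
dt = 0.01              # integration time step, in seconds
tau_ca = 2.0           # slow calcium time constant, much slower than synaptic signaling
calcium = np.zeros(P)  # one calcium state per process, not one per cell

def step(calcium, synaptic_activity):
    """One Euler step: each process integrates only its own synapse's activity."""
    calcium = calcium + dt * (-calcium + synaptic_activity) / tau_ca
    # Feedback to each synapse (e.g., release of chemical messengers) grows and
    # saturates with that process's calcium level; tanh is an illustrative choice.
    feedback = np.tanh(calcium)
    return calcium, feedback

# Drive only the first three synapses with noisy input for 5 simulated seconds.
drive = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0])
for _ in range(500):
    activity = drive + 0.05 * rng.standard_normal(P)
    calcium, feedback = step(calcium, activity)

print("per-process calcium: ", np.round(calcium, 2))
print("per-process feedback:", np.round(feedback, 2))
```

The only point of the toy is that the state lives at the level of individual processes, so a single astrocyte can hold many partially independent traces at the same time.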
“Our model does not need a lot of neurons to store a lot of memories,” Kozachkov said. “This is a significant advantage from an energy efficiency perspective, since neurons are metabolically ‘expensive.'”
The model offers a “biologically grounded explanation” for how these memory storage systems might operate in the brain, said Maurizio de Pittà, an assistant professor at the Krembil Research Institute in Toronto, Canada, who was not involved in the work. Past studies with high-resolution microscopes have supported this view, showing that astrocyte processes are interwoven throughout the brain and make contact with multiple synapses.
However, de Pittà told Live Science in an email that “models are powerful tools, but they remain approximations of the real world.” He also cautioned that current technologies cannot yet fully capture the dynamics unfolding in the human brain in real time, and that this level of detail would be needed to validate the hypothesis.
Although scientists are starting to realize that astrocytes play a role in how we form memories, de Pittà said, there is still no clear proof that the specific, calcium-based interactions between these cells and neurons actually help create, store or recall memories, as the MIT team suggests. If the team’s model proves correct, though, it could offer a new way to think about memory storage, suggesting that capacity could scale with the number of astrocyte-synapse interactions in the brain.
The model also offers potential therapeutic targets for neurodegenerative diseases, the study authors said.
“Astrocytes are known to be implicated in Alzheimer’s and other memory disorders: our model provides a computational view of what might be going wrong,” Kozachkov said. “Potentially, our mathematical model may inspire the search for new therapeutic targets: precise modulation of astrocyte process connectivity or signaling could restore or compensate for lost memory function.”
However, much more research would be needed for this work to be translated into clinical treatments.
Beyond neuroscience, the model may point to applications in artificial intelligence: it could help researchers build brain-like hardware systems, de Pittà said. Such systems could use dense memory architectures to store huge amounts of information and recall it efficiently while using very little energy, much as our brains do. That could prove useful for a wide array of applications, such as voice recognition, robotics and autonomous systems, AI assistants, and brain-machine interfaces or “neuroprosthetics.”