Neuroscience

Understanding short-term memory through neuronal plasticity

David Clark
PhD Student in Neurobiology and Behavior at Columbia University
Key takeaways
  • Synapses, not neurons, play the main role in working memory.
  • To simplify the analysis of neural networks, early studies treated synaptic connectivity as ‘fixed’, thereby leaving synaptic plasticity out of the picture.
  • Researchers at Columbia University updated the theory by including synaptic and neuronal dynamics.
  • They discovered that synaptic dynamics can modulate the overall behaviour of neural networks, speeding up or slowing down neuronal activity.
  • A new behaviour, called ‘frozen chaos’, was identified, where synapses create fixed patterns of neuronal activity, potentially crucial for working memory.
  • There is still room for improvement in this model: neuroscientists now want to incorporate certain biological properties of the brain to make it more realistic.

What role do neurons and synapses play in working memory? This is a question that neuroscientists have long pondered. Until now, it was thought that neuronal activity dominated, with synapses only involved in the slower processes of learning and memory. But researchers at Columbia University have now developed a new theoretical framework which predicts that synapses, rather than neurons, play the more important role. Their new model might point to an alternative mechanism for working memory in the brain, they say.

The human brain is made up of around 100 billion neurons. Each neuron receives electrical signals from other neurons via thousands of tiny connections called synapses. When the sum of the signals arriving through its synapses exceeds a certain threshold, a neuron “fires” by sending a series of voltage spikes to a large number of other neurons. Neurons are therefore “excitable”: below a certain input threshold, the output is small and roughly linear, but above the threshold it becomes large and non-linear.
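
As an illustration only (this simple integrate-and-fire model is not the one analysed in the study, and every number below is a made-up value), the way a neuron sums its synaptic inputs and fires once a threshold is crossed can be sketched in a few lines of Python:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters, chosen purely for illustration
dt = 1e-3            # simulation time step (s)
tau = 20e-3          # membrane time constant (s)
v_rest, v_thresh, v_reset = 0.0, 1.0, 0.0

n_syn = 1000                                  # number of synapses onto this neuron
weights = rng.normal(0.05, 0.02, n_syn)       # synaptic strengths (arbitrary units)

v = v_rest
spike_times = []
for step in range(2000):                      # simulate 2 seconds
    active = rng.random(n_syn) < 0.05         # which synapses deliver a signal this step
    total_input = weights[active].sum()       # sum of the incoming synaptic signals

    # leaky integration: below threshold the response stays small and roughly linear
    v += (dt / tau) * (v_rest - v + total_input)

    # above threshold the neuron "fires" a spike and resets
    if v >= v_thresh:
        spike_times.append(step * dt)
        v = v_reset

print(f"The neuron fired {len(spike_times)} spikes in 2 s")
```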

The strength of interactions between neurons can also change over time. This process, known as synaptic plasticity, is thought to play a crucial role in learning.

With and without plasticity

To simplify things, early studies in this field considered neuronal networks to be non-plastic. They assumed that synaptic connectivity was fixed, and researchers analysed how this connectivity shaped the collective activity of neurons. Although not realistic, this approach has enabled us to understand the basic principles of how these networks function.
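
A minimal sketch of this fixed-connectivity setting, in the spirit of classic random rate-network models (this is not the authors' code, and the gain g and network size are arbitrary choices), looks like this:

```python
import numpy as np

rng = np.random.default_rng(1)

N = 500                                        # number of neurons
g = 1.5                                        # coupling gain; g > 1 typically gives chaotic activity
J = rng.normal(0.0, g / np.sqrt(N), (N, N))    # synaptic weights, fixed once and for all

dt, tau = 0.1, 1.0
x = rng.normal(0.0, 1.0, N)                    # neuronal state variables

for _ in range(5000):
    # only the neurons evolve: tau * dx/dt = -x + J @ phi(x), with phi = tanh
    x += (dt / tau) * (-x + J @ np.tanh(x))

print("standard deviation of the activity:", np.std(np.tanh(x)))
```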

David Clark, a doctoral student in neurobiology and behaviour at Columbia University, and Larry Abbott, his thesis supervisor, have now extended this model to plastic synapses. This makes the system more complex – and more realistic – because neuronal activity can now dynamically shape the connectivity between neurons.
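
Schematically, adding plasticity means giving the weight matrix its own equation of motion, driven by the neuronal activity. The sketch below uses a generic Hebbian-style rule with decay; the exact rule and parametrisation studied by Clark and Abbott are given in the paper, so the names and values of tau_J and eta here should be read as illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

N = 300
g = 1.5
J = rng.normal(0.0, g / np.sqrt(N), (N, N))    # random baseline connectivity
J0 = J.copy()                                  # kept as the static component

dt = 0.1
tau_x = 1.0        # neuronal time constant
tau_J = 5.0        # synaptic time constant (assumed comparable to tau_x)
eta = 1.0 / N      # plasticity strength (assumed value)

x = rng.normal(0.0, 1.0, N)

for _ in range(5000):
    r = np.tanh(x)
    # neuronal dynamics, now shaped by a connectivity that itself changes over time
    x += (dt / tau_x) * (-x + J @ r)
    # synaptic dynamics: relaxation toward the static baseline plus a Hebbian term r_i * r_j
    J += (dt / tau_J) * (-(J - J0) + eta * np.outer(r, r))

print("size of the plastic part of the connectivity:", np.linalg.norm(J - J0))
```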

The researchers used a mathematical tool known as dynamical mean-field theory to reduce the “high-dimensional” network equations of the original model to a “low-dimensional” statistical description. In short, they modified the theory to include both synaptic and neuronal dynamics. This allowed them to develop a simpler model that incorporates many of the important factors involved in plastic neural networks. “The main challenge was to capture all the dynamics of neurons and synapses while maintaining an analytically solvable model,” explains David Clark.
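
The central object in such a mean-field description is not the individual trajectories but their statistics, for instance the population-averaged autocorrelation of the activity. The paper derives this analytically; the hedged sketch below only shows how the same quantity could be estimated numerically from a recorded trajectory of shape (timesteps, N):

```python
import numpy as np

def population_autocorrelation(traj, max_lag):
    """Estimate C(lag) = <x_i(t) x_i(t + lag)>, averaged over time and neurons.

    traj: array of shape (timesteps, N) containing the neuronal activities.
    Returns the lags and the autocorrelation normalised so that C(0) = 1.
    """
    traj = traj - traj.mean(axis=0)            # remove each neuron's mean
    T = traj.shape[0]
    lags = np.arange(max_lag)
    C = np.array([np.mean(traj[: T - lag] * traj[lag:]) for lag in lags])
    return lags, C / C[0]
```

In the mean-field limit, a handful of such statistical functions – for the neurons and for the synapses – summarise the behaviour of the whole high-dimensional network.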

Synaptic dynamics become important

The researchers found that when synaptic dynamics and neuronal dynamics occur on a similar time scale, synaptic dynamics become important in shaping the overall behaviour of a neural network. Their analyses also showed that synaptic dynamics can speed up or slow down neuronal dynamics and therefore reinforce or suppress the chaotic activity of neurons.
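
One hedged way to make “speeding up or slowing down” concrete is to measure a decorrelation time, for example the first lag at which the normalised autocorrelation from the previous sketch drops below 1/e, and to compare it with plasticity switched on versus off. The 1/e criterion is just one common convention, assumed here for illustration:

```python
import numpy as np

def decorrelation_time(lags, C, dt, threshold=np.exp(-1)):
    """First lag (in time units) at which the normalised autocorrelation falls below threshold."""
    below = np.where(C < threshold)[0]
    return lags[below[0]] * dt if below.size else np.inf
```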

Above all, they discovered a new type of behaviour that appears when synapses generate fixed patterns of neuronal activity in the network. These patterns appear when plasticity is momentarily deactivated, which has the effect of “freezing” the states of the neurons. This “frozen chaos”, as the researchers call it, could help to store information in the brain, suggesting a possible mechanism for working memory.
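
As a hedged illustration of what “freezing” means operationally (a numerical check one could run on the sketches above, not the analysis carried out in the paper), one can stop updating the weights and test whether the neuronal activity then settles into a fixed pattern:

```python
import numpy as np

def settles_into_fixed_pattern(x, J, dt=0.1, tau=1.0, max_steps=50000, tol=1e-9):
    """Run the neuronal dynamics with the synapses held fixed ("frozen") and
    report whether the activity converges to a steady pattern."""
    for _ in range(max_steps):
        dx = (dt / tau) * (-x + J @ np.tanh(x))
        x = x + dx
        if np.max(np.abs(dx)) < tol:
            return True                        # the activity has frozen into a fixed pattern
    return False
```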

“This research topic came about when Larry Abbott raised the idea, while chatting in his office, that dynamic synapses play just as important a role in neuronal computation as neurons themselves,” explains David Clark. “I found this idea very interesting, because it flips the typical view of neurons as the dynamic units, with synapses only being involved in the slower learning and memory processes. The scientific challenge in our study was to translate this intuition into equations and results.”

“The new model provides a possible new mechanism for working memory,” he adds. “More generally, we now have a solvable model of coupled neuronal and synaptic dynamics that could be extended, for example, to modelling how short-term memory is consolidated into long-term memory.”

David Clark and Larry Abbott now hope to make their model more realistic by incorporating certain biological properties of the brain, including the fact that neurons communicate via discrete voltage spikes. Other important features, such as the fact that neurons are wired together in specific, structured patterns of connectivity, will also have to be taken into account, they add.

Isabelle Dumé

Reference: D. G. Clark and L. F. Abbott, “Theory of coupled neuronal-synaptic dynamics,” Phys. Rev. X 14, 021001 (2024).
