Lyapunov theory demonstrating a fundamental limit on the speed of systems consolidation.

Physical Review Research (IF 4.2) · Pub Date: 2025-04-01 · Epub Date: 2025-05-21 · DOI: 10.1103/physrevresearch.7.023174
Alireza Alemi, Emre R F Aksay, Mark S Goldman
Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12392100/pdf/
Citations: 0

Abstract

The nervous system reorganizes memories from an early site to a late site, a commonly observed feature of learning and memory systems known as systems consolidation. Previous work has suggested learning rules by which consolidation may occur. Here, we provide conditions under which such rules are guaranteed to lead to stable convergence of learning and consolidation. We use the theory of Lyapunov functions, which enforces stability by requiring learning rules to decrease an energy-like (Lyapunov) function. We present the theory in the context of a simple circuit architecture motivated by classic models of cerebellum-mediated learning and consolidation. Stability is only guaranteed if the learning rate in the late stage is not faster than the learning rate in the early stage. Further, the slower the learning rate at the late stage, the larger the perturbation the system can tolerate with a guarantee of stability. We provide intuition for this result by mapping a simple example consolidation model to a damped driven oscillator system and showing that the ratio of early- to late-stage learning rates in the consolidation model can be directly identified with the oscillator's damping ratio. We then apply the theory to modeling the tuning by the cerebellum of a well-characterized analog short-term memory system, the oculomotor neural integrator, and find similar stability conditions. This work suggests the power of the Lyapunov approach to provide constraints on nervous system function.
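The damped-oscillator mapping described in the abstract can be illustrated with a minimal two-site model (an illustrative sketch, not the paper's actual circuit or notation): the early site rapidly learns the residual error, the late site slowly copies the early site's weight, and eliminating the early weight yields a second-order linear system whose damping ratio is set by the ratio of early- to late-stage learning rates.

```python
import numpy as np

def simulate(eta_early, eta_late, target=1.0, dt=1e-2, steps=10000):
    """Toy two-site consolidation model (illustrative; names are assumptions).

    The memory readout is the sum of the two sites' weights. The early site
    learns the residual error; the late site slowly consolidates the early
    weight. Eliminating w_early gives
        err'' + eta_early * err' + eta_early * eta_late * err = 0,
    a damped oscillator with damping ratio
        zeta = 0.5 * sqrt(eta_early / eta_late),
    so a slower late stage (smaller eta_late) means heavier damping.
    """
    w_early, w_late = 0.0, 0.0
    for _ in range(steps):
        err = target - (w_early + w_late)
        w_early += dt * eta_early * err      # fast, error-driven early learning
        w_late += dt * eta_late * w_early    # slow consolidation of the early trace
    return w_early, w_late

# Late rate slower than early rate: zeta > 1 (overdamped), stable convergence,
# with the early trace handed off to the late site.
w_e, w_l = simulate(eta_early=1.0, eta_late=0.1)
zeta = 0.5 * np.sqrt(1.0 / 0.1)  # damping ratio of the equivalent oscillator
```

At the fixed point the error and the early weight both vanish, so the memory resides entirely in the late site, consistent with the consolidation picture; raising the late-stage rate above the early-stage rate drives the damping ratio below one and the approach becomes oscillatory.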

