{"title":"利用 GABA 调制 STDP 稳定随机尖峰网络中的序列学习","authors":"Marius Vieth, Jochen Triesch","doi":"10.1016/j.neunet.2024.106985","DOIUrl":null,"url":null,"abstract":"<p><p>Cortical networks are capable of unsupervised learning and spontaneous replay of complex temporal sequences. Endowing artificial spiking neural networks with similar learning abilities remains a challenge. In particular, it is unresolved how different plasticity rules can contribute to both learning and the maintenance of network stability during learning. Here we introduce a biologically inspired form of GABA-Modulated Spike Timing-Dependent Plasticity (GMS) and demonstrate its ability to permit stable learning of complex temporal sequences including natural language in recurrent spiking neural networks. Motivated by biological findings, GMS utilizes the momentary level of inhibition onto excitatory cells to adjust both the magnitude and sign of Spike Timing-Dependent Plasticity (STDP) of connections between excitatory cells. In particular, high levels of inhibition in the network cause depression of excitatory-to-excitatory connections. We demonstrate the effectiveness of this mechanism during several sequence learning experiments with character- and token-based text inputs as well as visual input sequences. We show that GMS maintains stability during learning and spontaneous replay and permits the network to form a clustered hierarchical representation of its input sequences. Overall, we provide a biologically inspired model of unsupervised learning of complex sequences in recurrent spiking neural networks.</p>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"183 ","pages":"106985"},"PeriodicalIF":6.0000,"publicationDate":"2024-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Stabilizing sequence learning in stochastic spiking networks with GABA-Modulated STDP.\",\"authors\":\"Marius Vieth, Jochen Triesch\",\"doi\":\"10.1016/j.neunet.2024.106985\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Cortical networks are capable of unsupervised learning and spontaneous replay of complex temporal sequences. Endowing artificial spiking neural networks with similar learning abilities remains a challenge. In particular, it is unresolved how different plasticity rules can contribute to both learning and the maintenance of network stability during learning. Here we introduce a biologically inspired form of GABA-Modulated Spike Timing-Dependent Plasticity (GMS) and demonstrate its ability to permit stable learning of complex temporal sequences including natural language in recurrent spiking neural networks. Motivated by biological findings, GMS utilizes the momentary level of inhibition onto excitatory cells to adjust both the magnitude and sign of Spike Timing-Dependent Plasticity (STDP) of connections between excitatory cells. In particular, high levels of inhibition in the network cause depression of excitatory-to-excitatory connections. We demonstrate the effectiveness of this mechanism during several sequence learning experiments with character- and token-based text inputs as well as visual input sequences. We show that GMS maintains stability during learning and spontaneous replay and permits the network to form a clustered hierarchical representation of its input sequences. 
Overall, we provide a biologically inspired model of unsupervised learning of complex sequences in recurrent spiking neural networks.</p>\",\"PeriodicalId\":49763,\"journal\":{\"name\":\"Neural Networks\",\"volume\":\"183 \",\"pages\":\"106985\"},\"PeriodicalIF\":6.0000,\"publicationDate\":\"2024-12-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Networks\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1016/j.neunet.2024.106985\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1016/j.neunet.2024.106985","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Stabilizing sequence learning in stochastic spiking networks with GABA-Modulated STDP.
Cortical networks are capable of unsupervised learning and spontaneous replay of complex temporal sequences. Endowing artificial spiking neural networks with similar learning abilities remains a challenge. In particular, it is unresolved how different plasticity rules can contribute to both learning and the maintenance of network stability during learning. Here we introduce a biologically inspired form of GABA-Modulated Spike Timing-Dependent Plasticity (GMS) and demonstrate its ability to permit stable learning of complex temporal sequences including natural language in recurrent spiking neural networks. Motivated by biological findings, GMS utilizes the momentary level of inhibition onto excitatory cells to adjust both the magnitude and sign of Spike Timing-Dependent Plasticity (STDP) of connections between excitatory cells. In particular, high levels of inhibition in the network cause depression of excitatory-to-excitatory connections. We demonstrate the effectiveness of this mechanism during several sequence learning experiments with character- and token-based text inputs as well as visual input sequences. We show that GMS maintains stability during learning and spontaneous replay and permits the network to form a clustered hierarchical representation of its input sequences. Overall, we provide a biologically inspired model of unsupervised learning of complex sequences in recurrent spiking neural networks.
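The abstract describes the GMS mechanism only qualitatively, so the following is a minimal Python sketch of what such an inhibition-modulated STDP update might look like. Everything in it is an illustrative assumption rather than the paper's actual formulation: the function names (`stdp_kernel`, `gaba_modulation`, `gms_weight_update`), the tanh-shaped modulation factor that turns negative above an inhibition threshold `theta`, and all parameter values are invented here; the paper may define the rule quite differently.

```python
import numpy as np

def stdp_kernel(dt_ms, a_plus=0.01, a_minus=0.012, tau_ms=20.0):
    """Standard pair-based STDP kernel: potentiation when the presynaptic
    spike precedes the postsynaptic spike (dt > 0), depression otherwise."""
    if dt_ms > 0:
        return a_plus * np.exp(-dt_ms / tau_ms)
    return -a_minus * np.exp(dt_ms / tau_ms)

def gaba_modulation(inhibition, theta=0.5, slope=4.0):
    """Hypothetical smooth modulation factor in (-1, 1): positive at low
    inhibition, crossing zero near `theta`, negative at high inhibition.
    The tanh shape and threshold are assumptions made for illustration."""
    return float(np.tanh(slope * (theta - inhibition)))

def gms_weight_update(w, dt_ms, inhibition, lr=1.0, w_min=0.0, w_max=1.0):
    """One GMS-style update for an excitatory-to-excitatory synapse.

    `inhibition` is the momentary inhibitory input onto the postsynaptic
    cell, normalized to [0, 1]. Low inhibition yields ordinary (scaled)
    STDP; high inhibition yields depression regardless of spike timing,
    which is one reading of "adjusts both the magnitude and sign of STDP".
    """
    m = gaba_modulation(inhibition)
    if m >= 0.0:
        dw = m * stdp_kernel(dt_ms)        # scaled ordinary STDP
    else:
        dw = m * abs(stdp_kernel(dt_ms))   # pure depression at high inhibition
    return float(np.clip(w + lr * dw, w_min, w_max))

# Example: the same causal pairing (pre fires 5 ms before post) is
# potentiating when inhibition is low but depressing when it is high.
print(gms_weight_update(0.5, dt_ms=5.0, inhibition=0.1))  # slightly above 0.5
print(gms_weight_update(0.5, dt_ms=5.0, inhibition=0.9))  # slightly below 0.5
```

The point of the sketch is the sign flip: the same causal spike pairing that strengthens a synapse under low inhibition weakens it under high inhibition, which is the stabilizing property the abstract attributes to GMS.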
Journal introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.