A scalable implementation of the recursive least-squares algorithm for training spiking neural networks.

IF 2.5 | CAS Tier 4 (Medicine) | JCR Q2 (Mathematical & Computational Biology)
Frontiers in Neuroinformatics | Pub Date: 2023-06-27 | eCollection Date: 2023-01-01 | DOI: 10.3389/fninf.2023.1099510
Benjamin J Arthur, Christopher M Kim, Susu Chen, Stephan Preibisch, Ran Darshan
{"title":"A scalable implementation of the recursive least-squares algorithm for training spiking neural networks.","authors":"Benjamin J Arthur, Christopher M Kim, Susu Chen, Stephan Preibisch, Ran Darshan","doi":"10.3389/fninf.2023.1099510","DOIUrl":null,"url":null,"abstract":"<p><p>Training spiking recurrent neural networks on neuronal recordings or behavioral tasks has become a popular way to study computations performed by the nervous system. As the size and complexity of neural recordings increase, there is a need for efficient algorithms that can train models in a short period of time using minimal resources. We present optimized CPU and GPU implementations of the recursive least-squares algorithm in spiking neural networks. The GPU implementation can train networks of one million neurons, with 100 million plastic synapses and a billion static synapses, about 1,000 times faster than an unoptimized reference CPU implementation. We demonstrate the code's utility by training a network, in less than an hour, to reproduce the activity of > 66, 000 recorded neurons of a mouse performing a decision-making task. The fast implementation enables a more interactive <i>in-silico</i> study of the dynamics and connectivity underlying multi-area computations. It also admits the possibility to train models as <i>in-vivo</i> experiments are being conducted, thus closing the loop between modeling and experiments.</p>","PeriodicalId":12462,"journal":{"name":"Frontiers in Neuroinformatics","volume":"17 ","pages":"1099510"},"PeriodicalIF":2.5000,"publicationDate":"2023-06-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10333503/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in Neuroinformatics","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.3389/fninf.2023.1099510","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2023/1/1 0:00:00","PubModel":"eCollection","JCR":"Q2","JCRName":"MATHEMATICAL & COMPUTATIONAL BIOLOGY","Score":null,"Total":0}
Citations: 0

Abstract

Training spiking recurrent neural networks on neuronal recordings or behavioral tasks has become a popular way to study the computations performed by the nervous system. As the size and complexity of neural recordings increase, there is a need for efficient algorithms that can train models in a short period of time using minimal resources. We present optimized CPU and GPU implementations of the recursive least-squares algorithm in spiking neural networks. The GPU implementation can train networks of one million neurons, with 100 million plastic synapses and a billion static synapses, about 1,000 times faster than an unoptimized reference CPU implementation. We demonstrate the code's utility by training a network, in less than an hour, to reproduce the activity of >66,000 recorded neurons of a mouse performing a decision-making task. The fast implementation enables a more interactive in silico study of the dynamics and connectivity underlying multi-area computations. It also opens the possibility of training models while in vivo experiments are being conducted, thus closing the loop between modeling and experiments.
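For context, the core training rule the paper optimizes is the FORCE-style recursive least-squares update (Sussillo & Abbott, 2009), applied to the plastic synapses of each trained neuron. Below is a minimal, illustrative NumPy sketch of a single RLS step; it is not the authors' CPU/GPU code, the name rls_step and its arguments are our own labels, and in a spiking network r would be a vector of synaptically filtered spike trains.

```python
import numpy as np

def rls_step(P, w, r, target):
    """One FORCE-style recursive least-squares update (illustrative).

    P      -- (N, N) running estimate of the inverse correlation matrix
    w      -- (N,) plastic weights being trained
    r      -- (N,) filtered presynaptic activity at this time step
    target -- desired output of this neuron at this time step
    """
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)   # gain vector
    e = w @ r - target        # error before the update
    P -= np.outer(k, Pr)      # rank-1 update of the inverse correlation
    w -= e * k                # move the output toward the target
    return P, w

# Toy usage: train 100 weights to track a sine wave from random activity.
rng = np.random.default_rng(0)
N, alpha = 100, 1.0
P, w = np.eye(N) / alpha, np.zeros(N)
for t in range(1000):
    r = rng.standard_normal(N)  # stand-in for filtered network activity
    P, w = rls_step(P, w, r, np.sin(0.01 * t))
```

Each such step costs O(N²) in the number of plastic inputs per trained neuron, which is why an optimized, batched GPU implementation matters at the network scales reported above.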

Source journal: Frontiers in Neuroinformatics (Mathematical & Computational Biology; Neurosciences)
CiteScore: 4.80
Self-citation rate: 5.70%
Articles published: 132
Review time: 14 weeks
Journal description:

Frontiers in Neuroinformatics publishes rigorously peer-reviewed research on the development and implementation of numerical/computational models and analytical tools used to share, integrate, and analyze experimental data and advance theories of nervous system function. Specialty Chief Editors Jan G. Bjaalie at the University of Oslo and Sean L. Hill at the École Polytechnique Fédérale de Lausanne are supported by an outstanding Editorial Board of international experts. This multidisciplinary open-access journal is at the forefront of disseminating and communicating scientific knowledge and impactful discoveries to researchers, academics, and the public worldwide.

Neuroscience is being propelled into the information age as the volume of information explodes, demanding organization and synthesis. Novel synthesis approaches are opening up a new dimension for exploring the components of brain elements and systems and the vast number of variables that underlie their functions. Neural data are highly heterogeneous, with complex inter-relations across multiple levels, driving the need for innovative organizing and synthesizing approaches from genes to cognition, covering a range of species and disease states.

Frontiers in Neuroinformatics therefore welcomes submissions on existing neuroscience databases; the development of data and knowledge bases for all levels of neuroscience; applications and technologies that facilitate data sharing (interoperability, formats, terminologies, and ontologies); and novel tools for data acquisition, analysis, visualization, and dissemination of nervous system data. The journal also welcomes submissions on new tools (software and hardware) that support brain modeling, and on the merging of neuroscience databases with brain models used for simulation and visualization.