Mapping spatio-temporally encoded patterns by reward-modulated STDP in spiking neurons

Ibrahim Ozturk, D. Halliday
DOI: 10.1109/SSCI.2016.7850248
Published in: 2016 IEEE Symposium Series on Computational Intelligence (SSCI), December 2016
Citations: 2

Abstract

In this paper, a simple two-layer feed-forward spiking neural network (SNN) is developed and trained by reward-modulated Spike Timing Dependent Plasticity (STDP). Neurons based on the leaky integrate-and-fire (LIF) model are trained to associate input temporal sequences with a desired output spike pattern, both consisting of multiple spikes. A biologically plausible reward-modulated STDP learning rule is used so that the network can efficiently converge to optimal spike generation. The relative timing of pre- and postsynaptic firings can modify synaptic weights only once the reward has occurred. The history of Hebbian events is stored in the synaptic eligibility traces. The STDP process is applied to all synapses, each with a different delay. We experimentally demonstrate a benchmark with spatio-temporally encoded spike pairs. Results show successful transformations with high accuracy and quick convergence during learning. The proposed SNN architecture with reward-modulated STDP can therefore learn, in a stable manner, to map temporally encoded spike trains based on Poisson processes.
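The core mechanism described above, Hebbian STDP events accumulating in a per-synapse eligibility trace, with weight changes gated by a later reward signal, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; all parameter values (`tau_pre`, `tau_e`, `a_plus`, the pairing schedule) are assumptions chosen for demonstration.

```python
import numpy as np

class RStdpSynapse:
    """One synapse under reward-modulated STDP: STDP events accumulate in an
    eligibility trace e, and the weight w changes only when reward arrives."""

    def __init__(self, w=0.5, a_plus=0.01, a_minus=0.012,
                 tau_pre=20.0, tau_post=20.0, tau_e=100.0, lr=1.0):
        self.w = w
        self.a_plus, self.a_minus = a_plus, a_minus
        self.tau_pre, self.tau_post, self.tau_e = tau_pre, tau_post, tau_e
        self.lr = lr
        self.x_pre = 0.0   # presynaptic spike trace
        self.x_post = 0.0  # postsynaptic spike trace
        self.e = 0.0       # eligibility trace: stored history of Hebbian events

    def step(self, pre_spike, post_spike, reward=0.0, dt=1.0):
        # exponential decay of all traces over one time step (ms)
        self.x_pre *= np.exp(-dt / self.tau_pre)
        self.x_post *= np.exp(-dt / self.tau_post)
        self.e *= np.exp(-dt / self.tau_e)
        if pre_spike:
            self.x_pre += 1.0
            # post-before-pre pairing: candidate depression
            self.e -= self.a_minus * self.x_post
        if post_spike:
            self.x_post += 1.0
            # pre-before-post pairing: candidate potentiation
            self.e += self.a_plus * self.x_pre
        # Hebbian events become weight changes only once reward occurs
        self.w = float(np.clip(self.w + self.lr * reward * self.e, 0.0, 1.0))
        return self.w

# Demo: repeated causal pairing (pre 5 ms before post), reward 5 ms later.
syn = RStdpSynapse()
for trial in range(20):
    for t in range(50):
        pre = (t == 10)
        post = (t == 15)
        reward = 1.0 if t == 20 else 0.0
        syn.step(pre, post, reward)
final_w = syn.w
```

Because potentiation events dominate the eligibility trace at the moment reward is delivered, repeated rewarded causal pairings drive the weight up from its initial value of 0.5.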