{"title":"Noise and Dynamical Synapses as Optimization Tools for Spiking Neural Networks.","authors":"Yana Garipova, Shogo Yonekura, Yasuo Kuniyoshi","doi":"10.3390/e27030219","DOIUrl":null,"url":null,"abstract":"<p><p>Standard ANNs lack flexibility when handling corrupted input due to their fixed structure. In this paper, a spiking neural network utilizes biological temporal coding features in the form of noise-induced stochastic resonance and dynamical synapses to increase the model's performance when its parameters are not optimized for a given input. Using the analog XOR task as a simplified convolutional neural network model, this paper demonstrates two key results: (1) SNNs solve the problem that is linearly inseparable in ANN with fewer neurons, and (2) in leaky SNNs, the addition of noise and dynamical synapses compensate for non-optimal parameters, achieving near-optimal results for weaker inputs.</p>","PeriodicalId":11694,"journal":{"name":"Entropy","volume":"27 3","pages":""},"PeriodicalIF":2.1000,"publicationDate":"2025-02-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11941097/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Entropy","FirstCategoryId":"101","ListUrlMain":"https://doi.org/10.3390/e27030219","RegionNum":3,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"PHYSICS, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0
Abstract
Standard artificial neural networks (ANNs) lack flexibility when handling corrupted input due to their fixed structure. In this paper, a spiking neural network (SNN) uses biological temporal-coding features, in the form of noise-induced stochastic resonance and dynamical synapses, to improve the model's performance when its parameters are not optimized for a given input. Using the analog XOR task as a simplified convolutional neural network model, this paper demonstrates two key results: (1) SNNs solve, with fewer neurons, a problem that is linearly inseparable for an ANN, and (2) in leaky SNNs, the addition of noise and dynamical synapses compensates for non-optimal parameters, achieving near-optimal results for weaker inputs.
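To make the noise mechanism concrete, below is a minimal Python sketch, not the paper's implementation, of a leaky integrate-and-fire neuron driven by a weak, subthreshold input. Additive noise can occasionally push the membrane potential over threshold, which is the noise-induced stochastic resonance effect the abstract refers to. The function name `lif_spike_count` and all parameter values are illustrative assumptions.

```python
import numpy as np

def lif_spike_count(input_current, noise_std, T=1.0, dt=1e-3,
                    tau=0.02, v_threshold=1.0, v_reset=0.0, rng=None):
    """Count spikes of a noisy leaky integrate-and-fire neuron over T seconds."""
    rng = np.random.default_rng() if rng is None else rng
    v, spikes = v_reset, 0
    for _ in range(int(T / dt)):
        # Euler-Maruyama step: leaky integration toward the input plus Gaussian noise.
        noise = noise_std * np.sqrt(dt) * rng.standard_normal()
        v += dt * (-v + input_current) / tau + noise
        if v >= v_threshold:
            spikes += 1
            v = v_reset
    return spikes

# A subthreshold input (0.8 < threshold 1.0) never fires without noise;
# with increasing noise the neuron starts responding to the weak input.
weak_input = 0.8
for sigma in (0.0, 1.0, 2.0):
    print(f"noise_std={sigma}: spikes={lif_spike_count(weak_input, sigma)}")
```

In this sketch the noise-free run produces zero spikes, while moderate noise levels generally yield a nonzero spike count for the same weak input, a toy illustration of how noise can compensate for non-optimal (too-weak) drive.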
About the journal:
Entropy (ISSN 1099-4300) is an international and interdisciplinary journal of entropy and information studies that publishes reviews, regular research papers, and short notes. Our aim is to encourage scientists to publish their theoretical and experimental work in as much detail as possible. There is no restriction on the length of papers. If the work involves computation or experiments, the details must be provided so that the results can be reproduced.