{"title":"Theoretical understanding of gradients of spike functions as boolean functions","authors":"DongHyung Yoo, Doo Seok Jeong","doi":"10.1007/s40747-024-01607-9","DOIUrl":null,"url":null,"abstract":"<p>Applying an error-backpropagation algorithm to spiking neural networks frequently needs to employ fictive derivatives of spike functions (popularly referred to as surrogate gradients) because the spike function is considered non-differentiable. The non-differentiability comes into play given that the spike function is viewed as a numeric function, most popularly, the Heaviside step function of membrane potential. To get back to basics, the spike function is not a numeric but a Boolean function that outputs <i>True</i> or <i>False</i> upon the comparison of the current potential and threshold. In this regard, we propose a method to evaluate the gradient of spike function viewed as a Boolean function for fixed- and floating-point data formats. For both formats, the gradient is considerably similar to a delta function that peaks at the threshold for spiking, which justifies the approximation of the spike function to the Heaviside step function. Unfortunately, the error-backpropagation algorithm with this gradient function fails to outperform popularly employed surrogate gradients, which may arise from the narrow peak of the gradient function and consequent potential undershoot and overshoot around the spiking threshold with coarse timesteps. We provide theoretical grounds of this hypothesis.</p>","PeriodicalId":10524,"journal":{"name":"Complex & Intelligent Systems","volume":"16 1","pages":""},"PeriodicalIF":5.0000,"publicationDate":"2024-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Complex & Intelligent Systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s40747-024-01607-9","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 0
Abstract
Applying an error-backpropagation algorithm to spiking neural networks typically requires fictive derivatives of spike functions (popularly referred to as surrogate gradients) because the spike function is considered non-differentiable. The non-differentiability arises because the spike function is viewed as a numeric function, most commonly the Heaviside step function of the membrane potential. Getting back to basics, the spike function is not a numeric but a Boolean function that outputs True or False upon comparing the current potential with the threshold. In this regard, we propose a method to evaluate the gradient of the spike function viewed as a Boolean function for fixed- and floating-point data formats. For both formats, the gradient closely resembles a delta function peaking at the spiking threshold, which justifies approximating the spike function by the Heaviside step function. Unfortunately, the error-backpropagation algorithm with this gradient function fails to outperform popularly employed surrogate gradients, which may arise from the narrow peak of the gradient function and the consequent undershoot and overshoot of the potential around the spiking threshold at coarse timesteps. We provide theoretical grounds for this hypothesis.
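To make the surrogate-gradient idea the abstract contrasts with concrete, here is a minimal sketch of a Heaviside spike function with a fictive backward pass, written against PyTorch's custom-autograd API. The rectangular (boxcar) surrogate and its half-width alpha are illustrative assumptions, not the gradient proposed in the paper; the class and variable names are hypothetical.

```python
# Minimal sketch (assumed setup, not the authors' method): a Heaviside spike
# in the forward pass, with a boxcar surrogate gradient in the backward pass.
import torch

class SpikeFn(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v, threshold=1.0):
        ctx.save_for_backward(v)
        ctx.threshold = threshold
        # Forward: Heaviside step of membrane potential vs. threshold,
        # i.e. the Boolean comparison cast to a float spike (0.0 or 1.0).
        return (v >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        alpha = 0.5  # half-width of the boxcar surrogate (assumed value)
        # Surrogate: a rectangular window around the threshold stands in
        # for the true, delta-like derivative of the step function.
        surrogate = ((v - ctx.threshold).abs() < alpha).float() / (2 * alpha)
        return grad_output * surrogate, None  # no gradient for threshold

# Usage: gradients flow through the spike via the surrogate.
v = torch.randn(8, requires_grad=True)
spikes = SpikeFn.apply(v)
spikes.sum().backward()
print(v.grad)  # nonzero only where |v - threshold| < alpha
```

Narrowing alpha makes the surrogate approach the delta-like gradient the abstract describes; with coarse timesteps the potential can then jump over the narrow window around the threshold, which is the undershoot/overshoot failure mode the authors hypothesize.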
Journal description:
Complex & Intelligent Systems aims to provide a forum for presenting and discussing novel approaches, tools and techniques meant for attaining a cross-fertilization between the broad fields of complex systems, computational simulation, and intelligent analytics and visualization. The transdisciplinary research that the journal focuses on will expand the boundaries of our understanding by investigating the principles and processes that underlie many of the most profound problems facing society today.