Design space exploration and parameter tuning for neuromorphic applications
Kristofor D. Carlson, J. Nageswaran, N. Dutt, J. Krichmar
2013 International Conference on Hardware/Software Codesign and System Synthesis (CODES+ISSS), published 2013-09-29
DOI: 10.5555/2555692.2555712
Citations: 4
Abstract
Large-scale spiking neural networks (SNNs) have been used to successfully model complex neural circuits and to explore neural phenomena such as learning and memory, vision, audition, neural oscillations, and many other important aspects of neural function. Additionally, SNNs are particularly well-suited to run on neuromorphic hardware because spiking events are often sparse, leading to a potentially large reduction in both bandwidth requirements and power usage. The inclusion of realistic plasticity equations, neural dynamics, and recurrent topologies has increased the descriptive power of SNNs but has also made the task of tuning these biologically realistic SNNs difficult. We present an automated parameter-tuning framework capable of tuning large-scale SNNs quickly and efficiently using evolutionary algorithms (EAs) and off-the-shelf graphics processing units (GPUs).
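The abstract names evolutionary algorithms and GPU-accelerated simulation as the core of the tuning framework but gives no implementation details. The sketch below is a minimal, hypothetical illustration of such an evolutionary parameter-tuning loop in Python; the parameter names (`syn_weight`, `stdp_rate`) and the `evaluate_snn` fitness function are placeholders and do not come from the paper, where each fitness evaluation would instead run a full SNN simulation on the GPU and compare the network's behavior against a target.

```python
import random

# Hypothetical sketch: evolve a population of SNN parameter sets with a
# simple mutation-plus-truncation-selection evolutionary algorithm.
# Parameter names and bounds are illustrative, not taken from the paper.
POP_SIZE = 20
N_GENERATIONS = 50
BOUNDS = {"syn_weight": (0.0, 1.0), "stdp_rate": (0.0001, 0.01)}

def random_individual():
    # One candidate = one dictionary of SNN parameters drawn within bounds.
    return {k: random.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def evaluate_snn(params):
    # Placeholder fitness. In the paper's framework this step would run an
    # SNN simulation on the GPU and score how well the resulting network
    # activity matches the desired behavior (higher is better here).
    target = 0.5
    return -abs(params["syn_weight"] - target)

def mutate(params, sigma=0.05):
    # Gaussian mutation of one randomly chosen parameter, clipped to bounds.
    child = dict(params)
    key = random.choice(list(BOUNDS))
    lo, hi = BOUNDS[key]
    child[key] = min(hi, max(lo, child[key] + random.gauss(0.0, sigma * (hi - lo))))
    return child

population = [random_individual() for _ in range(POP_SIZE)]
for generation in range(N_GENERATIONS):
    # Rank candidates by fitness, keep the best half, refill by mutation.
    scored = sorted(population, key=evaluate_snn, reverse=True)
    parents = scored[: POP_SIZE // 2]
    offspring = [mutate(random.choice(parents)) for _ in range(POP_SIZE - len(parents))]
    population = parents + offspring

best = max(population, key=evaluate_snn)
print("best parameter set:", best)
```

In a GPU-backed setting, the expensive step is `evaluate_snn`; because each candidate in a generation can be simulated independently, the whole population can be evaluated in parallel, which is what makes off-the-shelf GPUs attractive for this kind of tuning.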