SNNAX -- Spiking Neural Networks in JAX

Jamie Lohoff, Jan Finkbeiner, Emre Neftci
{"title":"SNNAX -- Spiking Neural Networks in JAX","authors":"Jamie Lohoff, Jan Finkbeiner, Emre Neftci","doi":"arxiv-2409.02842","DOIUrl":null,"url":null,"abstract":"Spiking Neural Networks (SNNs) simulators are essential tools to prototype\nbiologically inspired models and neuromorphic hardware architectures and\npredict their performance. For such a tool, ease of use and flexibility are\ncritical, but so is simulation speed especially given the complexity inherent\nto simulating SNN. Here, we present SNNAX, a JAX-based framework for simulating\nand training such models with PyTorch-like intuitiveness and JAX-like execution\nspeed. SNNAX models are easily extended and customized to fit the desired model\nspecifications and target neuromorphic hardware. Additionally, SNNAX offers key\nfeatures for optimizing the training and deployment of SNNs such as flexible\nautomatic differentiation and just-in-time compilation. We evaluate and compare\nSNNAX to other commonly used machine learning (ML) frameworks used for\nprogramming SNNs. We provide key performance metrics, best practices,\ndocumented examples for simulating SNNs in SNNAX, and implement several\nbenchmarks used in the literature.","PeriodicalId":501347,"journal":{"name":"arXiv - CS - Neural and Evolutionary Computing","volume":"28 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Neural and Evolutionary Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.02842","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Spiking Neural Network (SNN) simulators are essential tools for prototyping biologically inspired models and neuromorphic hardware architectures and for predicting their performance. For such a tool, ease of use and flexibility are critical, but so is simulation speed, especially given the complexity inherent in simulating SNNs. Here, we present SNNAX, a JAX-based framework for simulating and training such models with PyTorch-like intuitiveness and JAX-like execution speed. SNNAX models are easily extended and customized to fit the desired model specifications and target neuromorphic hardware. Additionally, SNNAX offers key features for optimizing the training and deployment of SNNs, such as flexible automatic differentiation and just-in-time compilation. We evaluate and compare SNNAX to other machine learning (ML) frameworks commonly used for programming SNNs. We provide key performance metrics, best practices, and documented examples for simulating SNNs in SNNAX, and we implement several benchmarks used in the literature.
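The features highlighted in the abstract, flexible automatic differentiation and just-in-time compilation, can be illustrated with plain JAX. The sketch below is not the SNNAX API: it is a minimal, self-contained example of the kind of computation such a framework expresses, using a leaky integrate-and-fire (LIF) layer stepped with jax.lax.scan, a surrogate gradient for the non-differentiable spike function, and jax.jit. All names (spike, lif_step, simulate) and parameter values are illustrative assumptions, not part of SNNAX.

```python
import jax
import jax.numpy as jnp

@jax.custom_jvp
def spike(v):
    """Heaviside spike function: emit a spike when the membrane potential v >= 0."""
    return (v >= 0.0).astype(jnp.float32)

@spike.defjvp
def spike_jvp(primals, tangents):
    # Surrogate gradient: replace the Heaviside derivative with a smooth
    # SuperSpike-style pseudo-derivative so errors can flow through spikes.
    (v,), (v_dot,) = primals, tangents
    surrogate = 1.0 / (1.0 + 10.0 * jnp.abs(v)) ** 2
    return spike(v), surrogate * v_dot

def lif_step(v, x_t, w, beta=0.95, threshold=1.0):
    """One time step of a leaky integrate-and-fire layer."""
    v = beta * v + x_t @ w          # leaky integration of weighted input
    s = spike(v - threshold)        # spike when the potential crosses threshold
    v = v - s * threshold           # soft reset after a spike
    return v, s

@jax.jit
def simulate(w, inputs):
    """Run the LIF layer over a spike train of shape (time, in_features)."""
    v0 = jnp.zeros(w.shape[1])
    step = lambda v, x_t: lif_step(v, x_t, w)
    _, spikes = jax.lax.scan(step, v0, inputs)
    return spikes                   # shape (time, out_features)

def loss(w, inputs, target_rate):
    # Train the mean firing rate toward a target; gradients use the surrogate.
    rate = simulate(w, inputs).mean(axis=0)
    return jnp.mean((rate - target_rate) ** 2)

key = jax.random.PRNGKey(0)
w = jax.random.normal(key, (16, 4)) * 0.3
inputs = (jax.random.uniform(key, (100, 16)) < 0.1).astype(jnp.float32)
grads = jax.grad(loss)(w, inputs, target_rate=0.2)
print(grads.shape)  # (16, 4)
```

In this sketch, jax.grad works because the custom JVP of spike is linear in the tangent, so JAX can transpose it for reverse-mode differentiation, and the entire time loop stays inside a single jitted scan. Keeping the simulation inside compiled JAX primitives in this way is what gives JAX-based SNN simulators their execution speed.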