Spiking Diffusion Models

Jiahang Cao, Hanzhong Guo, Ziqing Wang, Deming Zhou, Hao Cheng, Qiang Zhang, Renjing Xu
arXiv:2408.16467 [cs.NE] (arXiv - CS - Neural and Evolutionary Computing), published 2024-08-29. Code: https://github.com/AndyCao1125/SDM

Abstract

Recent years have witnessed Spiking Neural Networks (SNNs) gaining attention for their ultra-low energy consumption and high biological plausibility compared with traditional Artificial Neural Networks (ANNs). Despite these distinguishing properties, the application of SNNs to the computationally intensive field of image generation remains under-explored. In this paper, we propose Spiking Diffusion Models (SDMs), an innovative family of SNN-based generative models that produce high-quality samples with significantly reduced energy consumption. In particular, we propose a Temporal-wise Spiking Mechanism (TSM) that allows SNNs to capture more temporal features from a bio-plasticity perspective. In addition, we propose a threshold-guided strategy that further improves performance by up to 16.7% without any additional training. We also make the first attempt to use the ANN-SNN conversion approach for SNN-based generation tasks. Extensive experimental results reveal that our approach not only achieves performance comparable to its ANN counterpart with few spiking time steps, but also outperforms previous SNN-based generative models by a large margin. Moreover, we demonstrate the high-quality generation ability of SDM on large-scale datasets, e.g., LSUN bedroom. This development marks a pivotal advancement in the capabilities of SNN-based generation, paving the way for future research toward low-energy and low-latency generative applications. Our code is available at https://github.com/AndyCao1125/SDM.
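For readers unfamiliar with spiking neurons, the sketch below shows the standard leaky integrate-and-fire (LIF) dynamics on which SNN architectures like SDMs are typically built: a membrane potential leaks toward the input and emits a binary spike when it crosses a threshold. This is generic background, not the paper's TSM implementation; the function name, time constant, and threshold values are illustrative assumptions.

```python
import numpy as np

def lif_forward(inputs, v_th=1.0, tau=2.0):
    """Simulate a single leaky integrate-and-fire neuron over T time steps.

    inputs: array of shape (T,), the input current at each step.
    v_th:   firing threshold (illustrative value).
    tau:    membrane time constant (illustrative value).
    Returns a binary spike train of shape (T,).
    """
    v = 0.0                              # membrane potential
    spikes = np.zeros_like(inputs)
    for t, x in enumerate(inputs):
        v = v + (x - v) / tau            # leaky integration toward the input
        if v >= v_th:                    # threshold crossing emits a spike
            spikes[t] = 1.0
            v = 0.0                      # hard reset after firing
    return spikes

train = lif_forward(np.array([0.6, 0.9, 1.2, 0.2, 1.5]))
```

Because the neuron's output is a sparse binary train rather than a dense activation, downstream layers only perform accumulate operations on spike events, which is the source of the energy savings the abstract refers to.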