Volatile memristive devices as short-term memory in a neuromorphic learning architecture

Jens Bürger, C. Teuscher
{"title":"易失性记忆装置在神经形态学习结构中的短期记忆作用","authors":"Jens Bürger, C. Teuscher","doi":"10.1145/2770287.2770313","DOIUrl":null,"url":null,"abstract":"Image classification with feed-forward neural networks typically assumes the application of input images as single column vectors, which leads to a large number of required input neurons as well as large synaptic arrays connecting individual neural layers. In this paper we show how a class of memristive devices can be used as non-linear, leaky integrators that extend regular feed-forward neural networks with short-term memory. By trading space for time, our novel architecture allows to reduce the number of neurons by a factor of 3 and the number of synapses up to 15 times on the MNIST data set compared to previously reported results. Furthermore, the results indicate that less neurons and synapses also leads to a reduced learning complexity. With memristive devices functioning as dynamic processing elements, our findings advocate for a diverse use of memristive devices that would allow to build more area-efficient hardware by exploiting more than just their non-volatile memory property.","PeriodicalId":6519,"journal":{"name":"2014 IEEE/ACM International Symposium on Nanoscale Architectures (NANOARCH)","volume":"33 1","pages":"104-109"},"PeriodicalIF":0.0000,"publicationDate":"2014-07-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"Volatile memristive devices as short-term memory in a neuromorphic learning architecture\",\"authors\":\"Jens Bürger, C. Teuscher\",\"doi\":\"10.1145/2770287.2770313\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Image classification with feed-forward neural networks typically assumes the application of input images as single column vectors, which leads to a large number of required input neurons as well as large synaptic arrays connecting individual neural layers. In this paper we show how a class of memristive devices can be used as non-linear, leaky integrators that extend regular feed-forward neural networks with short-term memory. By trading space for time, our novel architecture allows to reduce the number of neurons by a factor of 3 and the number of synapses up to 15 times on the MNIST data set compared to previously reported results. Furthermore, the results indicate that less neurons and synapses also leads to a reduced learning complexity. 
With memristive devices functioning as dynamic processing elements, our findings advocate for a diverse use of memristive devices that would allow to build more area-efficient hardware by exploiting more than just their non-volatile memory property.\",\"PeriodicalId\":6519,\"journal\":{\"name\":\"2014 IEEE/ACM International Symposium on Nanoscale Architectures (NANOARCH)\",\"volume\":\"33 1\",\"pages\":\"104-109\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2014-07-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2014 IEEE/ACM International Symposium on Nanoscale Architectures (NANOARCH)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2770287.2770313\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 IEEE/ACM International Symposium on Nanoscale Architectures (NANOARCH)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2770287.2770313","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9

Abstract

Image classification with feed-forward neural networks typically assumes that input images are applied as single column vectors, which requires a large number of input neurons as well as large synaptic arrays connecting the individual neural layers. In this paper we show how a class of memristive devices can be used as non-linear, leaky integrators that extend regular feed-forward neural networks with short-term memory. By trading space for time, our novel architecture reduces the number of neurons by a factor of 3 and the number of synapses by up to a factor of 15 on the MNIST data set compared to previously reported results. Furthermore, the results indicate that fewer neurons and synapses also lead to reduced learning complexity. With memristive devices functioning as dynamic processing elements, our findings advocate a diverse use of memristive devices that would allow building more area-efficient hardware by exploiting more than just their non-volatile memory property.
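To make the space-for-time trade concrete, the sketch below models the memristive short-term memory as a simple non-linear leaky integrator that consumes one image column per time step, so a 28×28 MNIST digit needs only 28 input neurons instead of 784. This is a minimal illustration under assumptions, not the paper's actual device model: the tanh non-linearity, the decay and gain constants, and the function names are introduced here purely for clarity.

```python
import numpy as np

# Assumed illustration parameters (not taken from the paper).
DECAY = 0.8   # leak factor applied to the stored state each time step
GAIN = 1.0    # coupling strength of the incoming column

def leaky_integrate_columns(image_28x28, decay=DECAY, gain=GAIN):
    """Feed the 28 columns of an image, one per time step, into a bank of
    28 memristor-like leaky integrators and return their final states.

    Each integrator follows a simple non-linear leaky update,
        s[t] = tanh(decay * s[t-1] + gain * x[t]),
    so it retains a decaying trace of earlier columns (short-term memory)
    while the spatial input width stays at only 28 values.
    """
    state = np.zeros(28)
    for column in image_28x28.T:           # one column per time step
        state = np.tanh(decay * state + gain * column)
    return state

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_digit = rng.random((28, 28))      # stand-in for an MNIST image
    features = leaky_integrate_columns(fake_digit)
    print(features.shape)                  # (28,) -> 28 input neurons suffice
```

Read this way, the downstream feed-forward classifier sees a 28-element feature vector whose values summarize the whole image over time, which is the intuition behind the reported reduction in neurons and synapses.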