Volatile memristive devices as short-term memory in a neuromorphic learning architecture
Jens Bürger, C. Teuscher
2014 IEEE/ACM International Symposium on Nanoscale Architectures (NANOARCH), pp. 104-109
Published: 2014-07-08
DOI: 10.1145/2770287.2770313
Citations: 9
Abstract
Image classification with feed-forward neural networks typically assumes that input images are applied as single column vectors, which leads to a large number of required input neurons as well as large synaptic arrays connecting individual neural layers. In this paper we show how a class of memristive devices can be used as non-linear, leaky integrators that extend regular feed-forward neural networks with short-term memory. By trading space for time, our novel architecture allows us to reduce the number of neurons by a factor of 3 and the number of synapses by up to 15 times on the MNIST data set compared to previously reported results. Furthermore, the results indicate that fewer neurons and synapses also lead to reduced learning complexity. With memristive devices functioning as dynamic processing elements, our findings advocate a diverse use of memristive devices that would allow building more area-efficient hardware by exploiting more than just their non-volatile memory property.
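The space-for-time trade described in the abstract can be illustrated with a toy model. The abstract gives no device equations, so the following sketch assumes a generic non-linear leaky-integrator update (a `tanh` non-linearity with decay constants `eta` and `tau`, both hypothetical parameters, not the paper's device model): an image is presented column by column, each "memristive" state integrates its row's input and leaks between presentations, and the final state vector serves as a short-term-memory summary of the whole image. The input layer then needs only one neuron per image row (28 for MNIST) rather than one per pixel (784).

```python
import numpy as np

def leaky_integrate(columns, eta=0.5, tau=0.2):
    """Toy volatile-memristor model: one state per image row.

    Each state integrates its input through a non-linearity (eta * tanh)
    and leaks toward zero (tau) between column presentations, so earlier
    columns fade gradually -- short-term memory, not non-volatile storage.
    `columns` has shape (n_columns, n_rows); columns arrive one per step.
    """
    state = np.zeros(columns.shape[1])
    for col in columns:
        state = state + eta * np.tanh(col) - tau * state  # integrate + leak
    return state  # compact summary of the whole image

# Present a 28x28 image column-by-column: the downstream feed-forward
# network sees a 28-dimensional input instead of a 784-dimensional one.
rng = np.random.default_rng(0)
image = rng.random((28, 28))
features = leaky_integrate(image)
print(features.shape)  # (28,)
```

The decay term is what makes the memory *short-term*: with `tau > 0` the contribution of early columns shrinks geometrically, mirroring the volatility the paper exploits, in contrast to the non-volatile retention memristors are usually valued for.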