Energy-Efficient, Two-Dimensional Analog Memory for Neuromorphic Computing
M. Sharbati, Yanhao Du, Feng Xiong
2018 76th Device Research Conference (DRC), June 2018. DOI: 10.1109/DRC.2018.8442140
Unlike modern computers, which compute with digital '0's and '1's, neural networks in the human brain exhibit analog changes in neural connections (i.e., synaptic weights) during decision-making and learning. This analog nature, together with the network's massive parallelism, is partly why the human brain (~20 W) outperforms even the most powerful computers (~1 MW) at complex tasks such as pattern recognition, with far better energy efficiency. Currently, the majority of research efforts toward artificial neural networks are based on digital CMOS technology [1], which cannot mimic the analog behavior of biological synapses and is therefore energy-intensive. Recently, emerging memory devices such as phase-change memory (PCM), resistive random-access memory (RRAM), and spin-transfer torque (STT) RAM [2]–[4] have been studied as artificial synapses, with their programmable conductance representing the synaptic weight. While these approaches are promising, they still face limitations such as poor controllability, subpar reliability, large variability, and a non-symmetrical resistance response.
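The "non-symmetrical resistance response" limitation mentioned above can be sketched with a toy model (illustrative only, not the device from this work): a synapse whose conductance is programmed by voltage pulses, where depression is steeper than potentiation, so paired increase/decrease pulses do not cancel and weight updates accumulate error during training. All class names and constants below are assumptions for illustration.

```python
# Toy model of an analog memristive synapse with asymmetric
# potentiation/depression. All parameter values are illustrative.

class AnalogSynapse:
    def __init__(self, g_min=1e-6, g_max=1e-4, g0=5e-5):
        self.g_min, self.g_max = g_min, g_max  # conductance bounds (siemens)
        self.g = g0                            # current conductance (siemens)

    def potentiate(self, alpha=0.05):
        # Conductance increase saturates as g approaches g_max.
        self.g += alpha * (self.g_max - self.g)

    def depress(self, beta=0.15):
        # Depression is steeper than potentiation (beta > alpha), so a
        # +pulse followed by a -pulse leaves a net change: the response
        # is non-symmetrical.
        self.g -= beta * (self.g - self.g_min)

syn = AnalogSynapse()
start = syn.g
syn.potentiate()
syn.depress()
print(f"net change after one +/- pulse pair: {syn.g - start:+.2e} S")
```

In a perfectly symmetric device the print statement would report zero; here the asymmetric update rules leave a net negative drift, which is one reason weight-update fidelity is a key figure of merit for analog synaptic memories.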