An electrically trainable artificial neural network (ETANN) with 10240 'floating gate' synapses

M. Holler, S. Tam, H. Castro, Robert G. Benson
{"title":"具有10240个“浮门”突触的电可训练人工神经网络(ETANN)","authors":"M. Holler, S. Tam, H. Castro, Robert G. Benson","doi":"10.1109/IJCNN.1989.118698","DOIUrl":null,"url":null,"abstract":"The use of floating-gate nonvolatile memory technology for analog storage of connection strengths, or weights, has previously been proposed and demonstrated. The authors report the analog storage and multiply characteristics of a new floating-gate synapse and further discuss the architecture of a neural network which uses this synapse cell. In the architecture described 8192 synapses are used to interconnect 64 neurons fully and to connect the 64 neurons to each of 64 inputs. Each synapse in the network multiplies a signed analog voltage by a stored weight and generates a differential current proportional to the product. Differential currents are summed on a pair of bit lines and transferred through a sigmoid function, appearing at the neuron output as an analog voltage. Input and output levels are compatible for ease in cascade-connecting these devices into multilayer networks. The width and height of weight-change pulses are calculated. The synapse cell size is 2009 mu m/sup 2/ using 1- mu m CMOS EEPROM technology.<<ETX>>","PeriodicalId":199877,"journal":{"name":"International 1989 Joint Conference on Neural Networks","volume":"76 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1990-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"439","resultStr":"{\"title\":\"An electrically trainable artificial neural network (ETANN) with 10240 'floating gate' synapses\",\"authors\":\"M. Holler, S. Tam, H. Castro, Robert G. Benson\",\"doi\":\"10.1109/IJCNN.1989.118698\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The use of floating-gate nonvolatile memory technology for analog storage of connection strengths, or weights, has previously been proposed and demonstrated. The authors report the analog storage and multiply characteristics of a new floating-gate synapse and further discuss the architecture of a neural network which uses this synapse cell. In the architecture described 8192 synapses are used to interconnect 64 neurons fully and to connect the 64 neurons to each of 64 inputs. Each synapse in the network multiplies a signed analog voltage by a stored weight and generates a differential current proportional to the product. Differential currents are summed on a pair of bit lines and transferred through a sigmoid function, appearing at the neuron output as an analog voltage. Input and output levels are compatible for ease in cascade-connecting these devices into multilayer networks. The width and height of weight-change pulses are calculated. 
The synapse cell size is 2009 mu m/sup 2/ using 1- mu m CMOS EEPROM technology.<<ETX>>\",\"PeriodicalId\":199877,\"journal\":{\"name\":\"International 1989 Joint Conference on Neural Networks\",\"volume\":\"76 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1990-01-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"439\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International 1989 Joint Conference on Neural Networks\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCNN.1989.118698\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International 1989 Joint Conference on Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.1989.118698","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Cited by: 439

Abstract

The use of floating-gate nonvolatile memory technology for analog storage of connection strengths, or weights, has previously been proposed and demonstrated. The authors report the analog storage and multiply characteristics of a new floating-gate synapse and further discuss the architecture of a neural network which uses this synapse cell. In the architecture described, 8192 synapses are used to interconnect 64 neurons fully and to connect the 64 neurons to each of 64 inputs. Each synapse in the network multiplies a signed analog voltage by a stored weight and generates a differential current proportional to the product. Differential currents are summed on a pair of bit lines and transferred through a sigmoid function, appearing at the neuron output as an analog voltage. Input and output levels are compatible for ease in cascade-connecting these devices into multilayer networks. The width and height of weight-change pulses are calculated. The synapse cell size is 2009 µm² using 1-µm CMOS EEPROM technology.
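
The forward computation described in the abstract (each synapse multiplies a signed analog input voltage by a stored weight, the resulting differential currents are summed on bit lines, and each neuron applies a sigmoid to produce an analog output voltage) can be modeled behaviorally in a few lines of software. The sketch below is illustrative only and is not the authors' implementation: the 64-input, 64-neuron layer size follows the abstract, while the weight range, input range, and exact sigmoid shape are assumptions.

# Behavioral model of one ETANN-style layer, as outlined in the abstract.
# Assumptions (not from the paper): weights and inputs in [-1, 1], unit
# scaling between current sums and the sigmoid, logistic sigmoid shape.
import numpy as np

N_INPUTS = 64    # inputs per layer, per the abstract
N_NEURONS = 64   # fully interconnected neurons, per the abstract

rng = np.random.default_rng(0)
weights = rng.uniform(-1.0, 1.0, size=(N_NEURONS, N_INPUTS))  # analog stored weights (assumed range)

def etann_layer(x_volts, w):
    """Multiply inputs by stored weights, sum per-neuron currents, apply a sigmoid."""
    # Each synapse contributes a current proportional to (input voltage * weight);
    # currents are summed along each neuron's pair of bit lines.
    summed_current = w @ x_volts
    # The summed current drives a sigmoid, yielding an analog output voltage
    # whose level is compatible with the inputs so devices can be cascaded.
    return 1.0 / (1.0 + np.exp(-summed_current))

x = rng.uniform(-1.0, 1.0, size=N_INPUTS)   # signed analog input voltages (assumed range)
layer1 = etann_layer(x, weights)
layer2 = etann_layer(layer1, weights)        # cascading two layers, as the abstract notes is possible
print(layer2[:4])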