The Computational Capacity of Neurons with Binary Weights or, Why it is OK to be a Bit Neuron

S. S. Venkatesh
{"title":"The Computational Capacity of Neurons with Binary Weights or, Why it is OK to be a Bit Neuron","authors":"S. S. Venkatesh","doi":"10.1109/ITW.1989.761439","DOIUrl":null,"url":null,"abstract":"synaptic weights prescribed in the algorithms might, at least for some applications, be an artifact of the algorithms used. Practical considerations in the building of hardware for these networks also dictate the study of the effect of imposing limited dynamic ranges on coefficients on computational capacity and learning in these neural network models. There are, hence, cogent theoretical and practical spurs to study networks with dynamic range limited synapses. We investigate two fundamental issues: Can we compute efficiently with these networks? Can constrained networks learn? Setting up a simple network model with binary weights, we argue that there is very little loss in eschewing real interconnections in favour of binary links. We demonstrate rigorously that the computational capacity scales gracefully when synaptic dynamic range is reduced from the continuum to a single bit: with binary connections the achievable capacity is as much as half that with real interconnections. Analogous results appear to hold for learning within the constraints of binary interconnections. Convergence rates for binary learning are reduced, but there is qualitative similarity to learning performance without constraints. While the actual mathematical demonstrations are quire involved, the algorithms themselves are quite simple and appeal persuasively to intuition. Based in part on the thesis that it is arguably easier to implement binary links than real interconnections, researchers have been building small prototype networks which appear to function reasonably we1L3 Our results appear to provide theoretical support for such ventures. It may be possible to generalise the results to other situations involving distributed","PeriodicalId":413028,"journal":{"name":"IEEE/CAM Information Theory Workshop at Cornell","volume":"11 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1989-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE/CAM Information Theory Workshop at Cornell","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ITW.1989.761439","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

The synaptic weights prescribed in the algorithms might, at least for some applications, be an artifact of the algorithms used. Practical considerations in building hardware for these networks also dictate the study of the effect that imposing limited dynamic ranges on the coefficients has on computational capacity and learning in these neural network models. There are, hence, cogent theoretical and practical spurs to study networks with dynamic-range-limited synapses. We investigate two fundamental issues: Can we compute efficiently with these networks? Can constrained networks learn? Setting up a simple network model with binary weights, we argue that there is very little loss in eschewing real interconnections in favour of binary links. We demonstrate rigorously that the computational capacity scales gracefully when the synaptic dynamic range is reduced from the continuum to a single bit: with binary connections the achievable capacity is as much as half that with real interconnections. Analogous results appear to hold for learning within the constraints of binary interconnections. Convergence rates for binary learning are reduced, but there is qualitative similarity to learning performance without constraints. While the actual mathematical demonstrations are quite involved, the algorithms themselves are quite simple and appeal persuasively to intuition. Based in part on the thesis that it is arguably easier to implement binary links than real interconnections, researchers have been building small prototype networks which appear to function reasonably well. Our results appear to provide theoretical support for such ventures. It may be possible to generalise the results to other situations involving distributed
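To make the scaling claim concrete: Cover's classical counting result puts the capacity of a linear threshold unit with $n$ real weights at roughly $2n$ random dichotomies, so the "as much as half" figure above would place the binary-weight capacity near $n$. The display below is a hedged reading of the abstract's claim combined with that classical fact; the notation $C_{\mathbb{R}}$ and $C_{\pm 1}$ is ours, not the paper's:

\[
C_{\mathbb{R}}(n) \approx 2n
\qquad\Longrightarrow\qquad
C_{\pm 1}(n) \approx \tfrac{1}{2}\, C_{\mathbb{R}}(n) \approx n .
\]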
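The abstract's model of a neuron with binary links can also be illustrated with a minimal sketch: a linear threshold unit whose weights are constrained to ±1, trained to store random ±1 patterns by flipping one offending weight bit at a time. This is an illustrative stand-in written for this page, not the (unstated) algorithm analysed in the paper; the pattern counts, the flip rule, and all names here are assumptions of the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 64   # number of +/-1 weights
m = 20   # patterns to store, kept well under the capacity scale discussed above

# Random +/-1 input patterns and +/-1 target outputs (hypothetical test data).
X = rng.choice([-1, 1], size=(m, n))
y = rng.choice([-1, 1], size=m)

# Binary weight vector, initialised at random.
w = rng.choice([-1, 1], size=n)

def predict(w, x):
    """Linear threshold unit: sign of the correlation, with 0 mapped to +1."""
    return 1 if x @ w >= 0 else -1

# Naive learning rule (NOT the paper's algorithm): while some pattern is
# misclassified, pick one such pattern and flip a single randomly chosen
# weight that disagrees with the desired direction y * x.
for _ in range(20_000):
    errors = [k for k in range(m) if predict(w, X[k]) != y[k]]
    if not errors:
        break
    k = rng.choice(errors)
    bad = np.flatnonzero(w != y[k] * X[k])  # bits whose flip helps pattern k
    if bad.size:
        w[rng.choice(bad)] *= -1

print("patterns still misclassified:", sum(predict(w, X[k]) != y[k] for k in range(m)))
```

Each flip raises the chosen pattern's alignment $y_k (x_k \cdot w)$ by two, so the rule makes local progress on one pattern at a time; whether it settles depends on how far $m$ sits below the capacity scale above.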