Incremental communication for multilayer neural networks in a field programmable gate array

J. R. Dick, K. Kent
DOI: 10.1109/PACRIM.2005.1517362
Published in: PACRIM. 2005 IEEE Pacific Rim Conference on Communications, Computers and Signal Processing, 2005.
Publication date: 2005-10-17
Citations: 0

Abstract

A neural network is a massively parallel distributed processor made up of simple processing units known as neurons. These neurons are organized in layers, and every neuron in each layer is connected to each neuron in the adjacent layers. This connection architecture creates an enormous number of communication links between neurons, which is a problem for hardware implementations of neural networks since communication links require costly hardware space. To overcome this space problem, incremental communication for multilayer neural networks has been proposed. Incremental communication works by communicating only the change in a value between neurons, rather than the value's entire magnitude. This allows the values to be represented with fewer bits, and thus communicated over narrower communication links. To validate and analyze this technique, a neural network is designed and implemented using both the incremental and the traditional communication approach.
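The core idea of the abstract can be illustrated with a software sketch: rather than transmitting a neuron's full-precision value each cycle, only the change since the last transmission is sent, quantized to a narrow bit-width. The bit-width, function names, and clamping behavior below are illustrative assumptions, not details taken from the paper.

```python
# Sketch of incremental (delta) communication between two "neurons":
# only the change in value crosses the link, clamped to a narrow
# bit-width. DELTA_BITS = 4 is an assumed, illustrative link width.

DELTA_BITS = 4
DELTA_MAX = (1 << (DELTA_BITS - 1)) - 1   # +7 for a 4-bit signed delta
DELTA_MIN = -(1 << (DELTA_BITS - 1))      # -8 for a 4-bit signed delta

def encode_delta(prev, current):
    """Clamp the change to the range representable on the narrow link."""
    return max(DELTA_MIN, min(DELTA_MAX, current - prev))

def transmit(values):
    """Send integer activations over the narrow delta link.

    The sender tracks the value the receiver has reconstructed so far;
    the receiver rebuilds the sequence by accumulating the deltas.
    """
    sender_state = 0    # sender's copy of the receiver's reconstruction
    receiver_state = 0  # receiver accumulates the incoming deltas
    received = []
    for v in values:
        d = encode_delta(sender_state, v)  # only d crosses the link
        sender_state += d
        receiver_state += d
        received.append(receiver_state)
    return received

print(transmit([3, 5, 4, 20, 21]))  # → [3, 5, 4, 11, 18]
```

Note the trade-off this sketch exposes: small changes are communicated exactly with few bits, but a large jump (4 to 20) exceeds the delta range and takes several cycles to catch up, which is the kind of behavior the paper's comparison against a traditional full-width implementation would need to evaluate.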