{"title":"A fixed point implementation of the backpropagation learning algorithm","authors":"R.K. Presley, R. Haggard","doi":"10.1109/SECON.1994.324283","DOIUrl":null,"url":null,"abstract":"In hardware implementations of digital artificial neural networks, the amount of logic that can be utilized is limited. Due to this limitation, learning algorithms that are to be executed in hardware must be implemented using fixed point arithmetic. Adapting the backpropagation learning algorithm to a fixed point arithmetic system requires many approximations, scaling techniques and the use of lookup tables. These methods are explained. The convergence results for a test example using fixed point, floating point and hardware implementations of the backpropagation algorithm are presented.<<ETX>>","PeriodicalId":119615,"journal":{"name":"Proceedings of SOUTHEASTCON '94","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-04-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of SOUTHEASTCON '94","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SECON.1994.324283","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9
Abstract
In hardware implementations of digital artificial neural networks, the amount of logic that can be utilized is limited. Due to this limitation, learning algorithms that are to be executed in hardware must be implemented using fixed point arithmetic. Adapting the backpropagation learning algorithm to a fixed point arithmetic system requires many approximations, scaling techniques, and the use of lookup tables. These methods are explained. The convergence results for a test example using fixed point, floating point, and hardware implementations of the backpropagation algorithm are presented.
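The paper itself supplies the details of these methods; as a rough illustration of the kind of machinery involved, the sketch below shows two common ingredients of a fixed-point backpropagation datapath: a fixed-point multiply with rescaling, and a sigmoid activation computed from a lookup table. The Q4.12 word format, the 256-entry table, and the [-8, 8) input range are illustrative assumptions, not the paper's actual parameters.

```c
/* Minimal sketch of fixed-point arithmetic with a lookup-table sigmoid.
   Assumed Q4.12 format: 16-bit signed values with 12 fractional bits,
   representing the range [-8.0, 8.0). These choices are hypothetical. */
#include <stdint.h>
#include <math.h>
#include <stdio.h>

#define FRAC_BITS 12
#define ONE (1 << FRAC_BITS)   /* 1.0 in Q4.12 */

typedef int16_t fix_t;

/* Fixed-point multiply: widen to 32 bits, then rescale back down,
   the scaling step the abstract alludes to. */
static fix_t fix_mul(fix_t a, fix_t b) {
    return (fix_t)(((int32_t)a * (int32_t)b) >> FRAC_BITS);
}

/* Sigmoid approximated by a lookup table over the full Q4.12 range. */
#define LUT_SIZE 256
static fix_t sigmoid_lut[LUT_SIZE];

static void build_sigmoid_lut(void) {
    for (int i = 0; i < LUT_SIZE; i++) {
        double x = -8.0 + 16.0 * i / LUT_SIZE;   /* index -> real input */
        double y = 1.0 / (1.0 + exp(-x));
        /* In hardware this table would be precomputed offline. */
        sigmoid_lut[i] = (fix_t)(y * ONE + 0.5);
    }
}

static fix_t fix_sigmoid(fix_t x) {
    /* Shift the signed input into an unsigned index:
       (x + 32768) spans 0..65535, and 65536 / LUT_SIZE = 2^8. */
    return sigmoid_lut[((int32_t)x + 32768) >> 8];
}

int main(void) {
    build_sigmoid_lut();
    fix_t w   = (fix_t)(0.5  * ONE);   /* weight 0.5  in Q4.12 */
    fix_t a   = (fix_t)(0.25 * ONE);   /* input  0.25 in Q4.12 */
    fix_t net = fix_mul(w, a);         /* 0.125 in Q4.12       */
    fix_t out = fix_sigmoid(net);
    printf("net = %f, sigmoid(net) = %f\n",
           net / (double)ONE, out / (double)ONE);
    return 0;
}
```

The widen-then-shift pattern in fix_mul keeps the intermediate product from overflowing the 16-bit word, while the table lookup replaces an exponential that would be impractical to evaluate directly in limited logic.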