Error analysis of quantized weights for feedforward neural networks (FNN)
Duanpei Wu, J. Gowdy
Proceedings of SOUTHEASTCON '94, published 1994-04-10
DOI: 10.1109/SECON.1994.324361
Citations: 1
Abstract
When a neural network is implemented with limited-precision hardware, errors from the quantization of weights become an important factor to consider. In this paper, the authors present several analysis results based on general FNN structures and use several examples to examine the relation between weight errors and output classifications. A worst-case lower bound is derived for L, the number of bits used to quantize the weights. The paper also includes a detailed analysis of AND gates.
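The paper derives its lower bound on L analytically; the abstract alone does not give the derivation. As a rough illustration of the setup only (not the authors' method), the sketch below uniformly quantizes the weights of a toy two-layer FNN at several bit widths and checks whether the winning output class survives the quantization error. All layer sizes, the `quantize` helper, and the activation choice are illustrative assumptions.

```python
import numpy as np

def quantize(w, L):
    """Illustrative signed uniform quantizer: map weights onto
    2**L - 1 evenly spaced levels spanning [-w_max, w_max]."""
    w_max = np.max(np.abs(w))
    step = 2 * w_max / (2**L - 1)  # quantization step size
    return np.round(w / step) * step

def forward(W1, W2, x):
    """Toy two-layer FNN: tanh hidden layer, linear output scores."""
    h = np.tanh(x @ W1)
    return h @ W2

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # hidden-layer weights (assumed sizes)
W2 = rng.normal(size=(8, 3))   # output-layer weights
x = rng.normal(size=4)         # a single input vector

y_full = forward(W1, W2, x)    # full-precision reference output

# Output error and classification agreement at a few bit widths.
errs = {}
for L in (2, 4, 8):
    y_q = forward(quantize(W1, L), quantize(W2, L), x)
    errs[L] = np.max(np.abs(y_q - y_full))
    print(f"L={L}: max output error {errs[L]:.4f}, "
          f"class preserved: {np.argmax(y_q) == np.argmax(y_full)}")
```

With more bits the quantization step shrinks, so the output perturbation falls and the output classification is eventually preserved, which is the trade-off the paper's worst-case bound on L formalizes.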