{"title":"神经网络有限寄存器效应的信息论分析","authors":"M. Walker, L. Akers","doi":"10.1109/IJCNN.1992.226911","DOIUrl":null,"url":null,"abstract":"Information theory is used to analyze the effects of finite resolution and nonlinearities in multi-layered networks. The authors formulate the effect on the information content of the output of a neural processing element caused by storing continuous quantities in binary registers. The analysis reveals that the effect of quantization on information in a neural processing element is a function of the information content of the input, as well as the node nonlinearity and the length of the binary register containing the output. By casting traditional types of neural processing in statistical form, two classes of information processing in neural networks are identified. Each has widely different resolution requirements. Information theory is thus shown to provide a means of formalizing this taxonomy of neural network processing and is a method for linking the highly abstract processing performed by a neural network and the constraints of its implementation.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"44 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Information-theoretic analysis of finite register effects in neural networks\",\"authors\":\"M. Walker, L. Akers\",\"doi\":\"10.1109/IJCNN.1992.226911\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Information theory is used to analyze the effects of finite resolution and nonlinearities in multi-layered networks. The authors formulate the effect on the information content of the output of a neural processing element caused by storing continuous quantities in binary registers. The analysis reveals that the effect of quantization on information in a neural processing element is a function of the information content of the input, as well as the node nonlinearity and the length of the binary register containing the output. By casting traditional types of neural processing in statistical form, two classes of information processing in neural networks are identified. Each has widely different resolution requirements. 
Information theory is thus shown to provide a means of formalizing this taxonomy of neural network processing and is a method for linking the highly abstract processing performed by a neural network and the constraints of its implementation.<<ETX>>\",\"PeriodicalId\":286849,\"journal\":{\"name\":\"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks\",\"volume\":\"44 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1992-06-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCNN.1992.226911\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.1992.226911","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Information theory is used to analyze the effects of finite resolution and nonlinearities in multi-layered networks. The authors formulate how storing continuous quantities in binary registers affects the information content of the output of a neural processing element. The analysis reveals that the effect of quantization on information in a neural processing element is a function of the information content of the input, as well as the node nonlinearity and the length of the binary register containing the output. By casting traditional types of neural processing in statistical form, two classes of information processing in neural networks are identified, with widely different resolution requirements. Information theory is thus shown to provide a means of formalizing this taxonomy of neural network processing and a method for linking the highly abstract processing performed by a neural network with the constraints of its implementation.
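
The central claim, that the information surviving quantization depends jointly on the information content of the input, the node nonlinearity, and the register length, can be illustrated numerically. The sketch below is not taken from the paper: it assumes a Gaussian-distributed net input, a logistic nonlinearity, and uniform quantization of the output into a b-bit register, and simply estimates the entropy of the quantized output for a few register lengths and input spreads.

```python
# Minimal numerical sketch (illustrative assumptions, not the paper's derivation):
# estimate how many bits of information survive when a sigmoid node's output is
# stored in a b-bit register, for inputs of varying spread.
import numpy as np

rng = np.random.default_rng(0)

def output_entropy_bits(register_bits, input_std, n_samples=200_000):
    """Entropy (in bits) of the quantized output of a logistic node."""
    x = rng.normal(0.0, input_std, n_samples)   # net input to the node (assumed Gaussian)
    y = 1.0 / (1.0 + np.exp(-x))                # node nonlinearity (logistic)
    levels = 2 ** register_bits                 # codes available in the register
    q = np.floor(y * levels).astype(int)        # uniform quantization of [0, 1)
    q = np.clip(q, 0, levels - 1)
    p = np.bincount(q, minlength=levels) / n_samples
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

for bits in (2, 4, 8):
    for std in (0.5, 2.0, 8.0):
        h = output_entropy_bits(bits, std)
        print(f"register = {bits:2d} bits, input std = {std:4.1f} "
              f"-> output entropy ~ {h:.2f} bits")
```

In this toy setting the quantized output entropy is bounded above by the register length, but large inputs drive the nonlinearity into saturation and concentrate probability mass on the extreme codes, so the realized information content can fall well below that bound. This interaction between input statistics, nonlinearity, and register length is the kind of effect the paper formalizes.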