{"title":"多层神经网络的强制选择信息约简","authors":"R. Kamimura, Ryozo Kitajima","doi":"10.29007/n4kz","DOIUrl":null,"url":null,"abstract":"The present paper aims to reduce unnecessary information obtained through inputs, supposed to be inappropriately encoded, for producing easily interpretable networks with better generalization. The proposed method lies mainly in forced reduction of selective information even at the expense of a larger cost to eliminate unnecessary information coming from the inputs in the initial stage of learning. Then, in the later stage of learning, selective information is increased to produce a small number of really important connection weights for learning. The method was preliminarily applied to two business data sets: the bankruptcy and the mission statement data sets, in which the interpretation is considered as important as generalization performance. The results show that selective information could be decreased, though the cost to realize this reduction became larger. However, the accompa- nying selective information increase could be used to compensate for the expensive cost to produce simpler and interpretable internal representations with better generalization performance.","PeriodicalId":93549,"journal":{"name":"EPiC series in computing","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Forced Selective Information Reduction for Interpreting Multi-Layered Neural Networks\",\"authors\":\"R. Kamimura, Ryozo Kitajima\",\"doi\":\"10.29007/n4kz\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The present paper aims to reduce unnecessary information obtained through inputs, supposed to be inappropriately encoded, for producing easily interpretable networks with better generalization. The proposed method lies mainly in forced reduction of selective information even at the expense of a larger cost to eliminate unnecessary information coming from the inputs in the initial stage of learning. Then, in the later stage of learning, selective information is increased to produce a small number of really important connection weights for learning. The method was preliminarily applied to two business data sets: the bankruptcy and the mission statement data sets, in which the interpretation is considered as important as generalization performance. The results show that selective information could be decreased, though the cost to realize this reduction became larger. 
However, the accompa- nying selective information increase could be used to compensate for the expensive cost to produce simpler and interpretable internal representations with better generalization performance.\",\"PeriodicalId\":93549,\"journal\":{\"name\":\"EPiC series in computing\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1900-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"EPiC series in computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.29007/n4kz\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"EPiC series in computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.29007/n4kz","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Forced Selective Information Reduction for Interpreting Multi-Layered Neural Networks
This paper aims to reduce unnecessary information obtained through the inputs, which is assumed to be inappropriately encoded, in order to produce easily interpretable networks with better generalization. The proposed method consists mainly in the forced reduction of selective information, even at the expense of a larger cost, so as to eliminate unnecessary information coming from the inputs in the initial stage of learning. In the later stage of learning, selective information is then increased to produce a small number of genuinely important connection weights. The method was preliminarily applied to two business data sets, the bankruptcy and the mission-statement data sets, for which interpretation is considered as important as generalization performance. The results show that selective information could be decreased, though the cost of realizing this reduction became larger. However, the accompanying increase in selective information could compensate for this expensive cost, yielding simpler and more interpretable internal representations with better generalization performance.
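Read as a training procedure, the abstract suggests a two-stage schedule: a regularizer on selective information is first minimized, even at the expense of task cost, and later maximized, so that only a few important connection weights survive. Below is a minimal PyTorch sketch of that schedule under stated assumptions: the abstract does not define its measure, so selective_information here is a hypothetical variance-style proxy over normalized absolute weights, and the architecture, beta, and the halfway sign switch are likewise illustrative choices, not the authors' settings.

```python
import torch
import torch.nn as nn

def selective_information(weight: torch.Tensor) -> torch.Tensor:
    # Hypothetical proxy (an assumption, not the authors' definition):
    # how concentrated the normalized absolute weights are. Low value =
    # strengths spread evenly; high value = a few connections dominate.
    u = weight.abs() / (weight.abs().max() + 1e-12)  # strengths in [0, 1]
    return ((1.0 - u) ** 2).mean()

model = nn.Sequential(nn.Linear(10, 8), nn.Tanh(), nn.Linear(8, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def training_step(x, y, epoch, total_epochs, beta=0.1):
    # Stage 1 (early epochs): force selective information *down*, even at
    # the expense of a larger task cost. Stage 2 (later epochs): flip the
    # sign so selective information is *increased*, leaving a small number
    # of important connection weights.
    sign = 1.0 if epoch < total_epochs // 2 else -1.0
    si = sum(selective_information(m.weight)
             for m in model.modules() if isinstance(m, nn.Linear))
    loss = criterion(model(x), y) + sign * beta * si
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage on random data, standing in for the bankruptcy /
# mission-statement sets mentioned in the abstract.
x, y = torch.randn(64, 10), torch.randint(0, 2, (64,))
for epoch in range(100):
    training_step(x, y, epoch, total_epochs=100)
```

The only moving part is the sign of the regularization term: positive during the initial stage (reduction) and negative later (increase); whatever schedule the paper actually uses would replace the halfway switch shown here.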