Weight value initialization for improving training speed in the backpropagation network

Young-Ik Kim, Jong Beom Ra

[Proceedings] 1991 IEEE International Joint Conference on Neural Networks, 18 November 1991. DOI: 10.1109/IJCNN.1991.170747. Citation count: 69.

Abstract: A method for initializing the weight values of multilayer feedforward neural networks is proposed to improve a network's learning speed. The method derives a minimum bound on the weights from the dynamics of decision boundaries under the generalized delta rule. Computer simulations on several neural network models show that proper selection of the initial weight values improves learning ability and contributes to fast convergence.
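The abstract does not reproduce the paper's derived bound, but the core idea of constraining initial weight magnitudes away from zero can be sketched as follows. This is a minimal illustration, not the authors' method: the lower bound `w_min` here is a hypothetical placeholder for the value the paper derives from decision-boundary dynamics.

```python
import numpy as np

def init_weights(n_in, n_out, w_min=0.1, w_max=1.0, seed=None):
    """Initialize a weight matrix so every weight magnitude lies in
    [w_min, w_max], with random sign.

    NOTE: w_min stands in for the paper's derived minimum bound; the
    actual value would come from the decision-boundary analysis based
    on the generalized delta rule, which the abstract does not give.
    """
    rng = np.random.default_rng(seed)
    # Sample magnitudes bounded away from zero, then attach random signs.
    magnitudes = rng.uniform(w_min, w_max, size=(n_in, n_out))
    signs = rng.choice([-1.0, 1.0], size=(n_in, n_out))
    return signs * magnitudes
```

Keeping initial weights off zero avoids the flat region of the sigmoid-derivative term in the generalized delta rule, which is the intuition behind faster early convergence.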