{"title":"基于迭代调整误差阈值的并行神经学习","authors":"T. Hong, Jyh-Jong Lee","doi":"10.1109/ICPADS.1998.741026","DOIUrl":null,"url":null,"abstract":"We first propose a modified backpropagation learning algorithm that incrementally decreases the error threshold by half in order to process training instances with large weight changes as quickly as possible. This modified backpropagation learning algorithm is then parallelized using the single-channel broadcast communication model to n processors, where n is the number of training instances. Finally, the parallel backpropagation learning algorithm is modified for execution on a bounded number of processors to cope with real-world conditions.","PeriodicalId":226947,"journal":{"name":"Proceedings 1998 International Conference on Parallel and Distributed Systems (Cat. No.98TB100250)","volume":"67 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1998-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Parallel neural learning by iteratively adjusting error thresholds\",\"authors\":\"T. Hong, Jyh-Jong Lee\",\"doi\":\"10.1109/ICPADS.1998.741026\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We first propose a modified backpropagation learning algorithm that incrementally decreases the error threshold by half in order to process training instances with large weight changes as quickly as possible. This modified backpropagation learning algorithm is then parallelized using the single-channel broadcast communication model to n processors, where n is the number of training instances. Finally, the parallel backpropagation learning algorithm is modified for execution on a bounded number of processors to cope with real-world conditions.\",\"PeriodicalId\":226947,\"journal\":{\"name\":\"Proceedings 1998 International Conference on Parallel and Distributed Systems (Cat. No.98TB100250)\",\"volume\":\"67 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1998-12-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings 1998 International Conference on Parallel and Distributed Systems (Cat. No.98TB100250)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICPADS.1998.741026\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings 1998 International Conference on Parallel and Distributed Systems (Cat. No.98TB100250)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICPADS.1998.741026","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Parallel neural learning by iteratively adjusting error thresholds
We first propose a modified backpropagation learning algorithm that iteratively halves the error threshold so that training instances producing large weight changes are processed as quickly as possible. This modified algorithm is then parallelized on n processors under the single-channel broadcast communication model, where n is the number of training instances. Finally, the parallel backpropagation learning algorithm is adapted to run on a bounded number of processors to cope with real-world conditions.
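The sketch below illustrates the error-threshold-halving idea on a single machine: instances whose per-instance error still exceeds the current threshold trigger a backpropagation update, and once every instance satisfies the threshold it is halved and training continues. The network size, learning rate, initial and final thresholds, pass cap, and the XOR-style toy data are all illustrative assumptions rather than values from the paper, and the parallel broadcast-model version is not reproduced here.

```python
# Minimal sketch of training with an iteratively halved error threshold (assumptions noted above).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy training set (assumed): 2-input XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer with 4 units (assumed size) and a simple learning rate.
W1 = rng.normal(scale=0.5, size=(2, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))
lr = 0.5

threshold = 0.5          # initial error threshold (assumed)
final_threshold = 0.01   # stop once the threshold drops below this (assumed)

while threshold >= final_threshold:
    # Sweep the training set until no instance's error exceeds the current
    # threshold (a pass cap guards against non-convergence in this toy setup).
    for _ in range(10000):
        updated_any = False
        for x, t in zip(X, T):
            # Forward pass.
            h = sigmoid(x @ W1)
            y = sigmoid(h @ W2)
            err = 0.5 * np.sum((t - y) ** 2)
            # Only instances whose error still exceeds the threshold are
            # updated now; the rest are skipped at this threshold level.
            if err <= threshold:
                continue
            # Standard backpropagation update for this single instance.
            delta_out = (t - y) * y * (1 - y)
            delta_hid = (delta_out @ W2.T) * h * (1 - h)
            W2 += lr * np.outer(h, delta_out)
            W1 += lr * np.outer(x, delta_hid)
            updated_any = True
        if not updated_any:
            break
    # Every instance now meets the current threshold; halve it and continue.
    threshold /= 2.0

print("final threshold reached:", threshold)
```

In this reading, a large threshold early on concentrates the updates on the instances whose errors (and hence weight changes) are largest, and halving the threshold progressively draws the remaining instances into training; how this loop is distributed over processors in the broadcast model is described in the paper itself.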