Learning of an XOR problem in the presence of noise and redundancy

D. Cousineau
Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.
DOI: 10.1109/IJCNN.2005.1556226
Published: 2005-12-27

Abstract: Recently introduced time-based networks represent an alternative to the usual strength-based networks. In this paper, we compare one instance from each family of networks, of comparable complexity — the perceptron and the race network — when faced with uncertain input. Uncertainty was manipulated in two different ways: within channels by adding noise, and between channels by adding redundant inputs. For the perceptron, the results indicate that if noise is high, redundancy must be low (or vice versa), otherwise learning does not occur. For the race network, the opposite is true: if both noise and redundancy increase, learning remains both fast and reliable. Asymptotic statistical theories suggest that these results may hold for all networks belonging to these two families. Thus, redundancy is a nontrivial factor.
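The two manipulations described in the abstract — noise added within each input channel, and redundancy added by duplicating channels — can be illustrated with a minimal numpy sketch. This is not the paper's implementation: the race network and the exact training protocol are not reproduced, and the data-generation function, network size, and learning rate below are all illustrative assumptions. The sketch trains a small two-layer perceptron on XOR inputs whose two logical channels are duplicated and corrupted with Gaussian noise.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_xor_data(n, redundancy=2, noise=0.1):
    """XOR dataset where each of the two logical channels is copied
    `redundancy` times (between-channel redundancy) and perturbed with
    Gaussian noise (within-channel noise). Illustrative setup only;
    the paper's exact protocol may differ."""
    base = rng.integers(0, 2, size=(n, 2)).astype(float)
    y = (base[:, 0] != base[:, 1]).astype(float).reshape(-1, 1)
    X = np.repeat(base, redundancy, axis=1)          # duplicate channels
    X += rng.normal(0.0, noise, size=X.shape)        # corrupt each copy
    return X, y

def train_mlp(X, y, hidden=8, lr=1.0, epochs=5000):
    """Tiny two-layer perceptron with sigmoid units, trained by plain
    batch gradient descent on squared error -- enough capacity for XOR."""
    n_in = X.shape[1]
    W1 = rng.normal(0, 1, (n_in, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0, 1, (hidden, 1))
    b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1 + b1)                         # hidden activations
        out = sig(h @ W2 + b2)                       # network output
        d_out = (out - y) * out * (1 - out)          # output-layer delta
        d_h = (d_out @ W2.T) * h * (1 - h)           # backpropagated delta
        W2 -= lr * h.T @ d_out / len(X)
        b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * X.T @ d_h / len(X)
        b1 -= lr * d_h.mean(axis=0)
    return lambda Xq: sig(sig(Xq @ W1 + b1) @ W2 + b2)

X, y = make_xor_data(400, redundancy=2, noise=0.1)
model = train_mlp(X, y)
acc = ((model(X) > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Raising `noise` while also raising `redundancy` lets one probe the trade-off the abstract describes for the perceptron family; how accuracy actually degrades will depend on the training details, which this sketch does not take from the paper.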