Learning of an XOR problem in the presence of noise and redundancy

D. Cousineau
DOI: 10.1109/IJCNN.2005.1556226
Published in: Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.
Publication date: 2005-12-27
Citations: 1

Abstract

Recently introduced time-based networks represent an alternative to the usual strength-based networks. In this paper, we compare two networks of comparable complexity, one from each family: the perceptron and the race network, when faced with uncertain input. Uncertainty was manipulated in two different ways: within channels by adding noise, and between channels by adding redundant inputs. For the perceptron, results indicate that if noise is high, redundancy must be low (or vice versa); otherwise learning does not occur. For the race network, the opposite is true: if both noise and redundancy increase, learning remains both fast and reliable. Asymptotic statistical theory suggests that these results may hold for all networks belonging to these two families. Thus, redundancy is a non-trivial factor.
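The paper does not reproduce its stimulus-generation procedure here, but the two uncertainty manipulations described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the function name, parameters (`redundancy`, `noise_sd`), and the use of Gaussian noise are assumptions for the sake of the example.

```python
import numpy as np

def make_noisy_redundant_xor(n_patterns=100, redundancy=2, noise_sd=0.1, seed=0):
    """Generate XOR training patterns with the two manipulations
    from the abstract (illustrative assumptions, not the paper's code):
    - within-channel uncertainty: additive noise on each input value
    - between-channel uncertainty: each logical input duplicated
      across `redundancy` channels.
    """
    rng = np.random.default_rng(seed)
    base = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    targets = np.array([0, 1, 1, 0], dtype=float)
    idx = rng.integers(0, 4, size=n_patterns)
    x = np.repeat(base[idx], redundancy, axis=1)   # redundant input channels
    x += rng.normal(0.0, noise_sd, size=x.shape)   # within-channel noise
    return x, targets[idx]

x, y = make_noisy_redundant_xor(n_patterns=8, redundancy=3, noise_sd=0.2)
print(x.shape)  # 8 patterns, 2 logical inputs x 3 redundant channels = 6 columns
```

Raising `noise_sd` increases within-channel uncertainty while raising `redundancy` widens the input layer, which is the trade-off the paper probes for each network family.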