{"title":"Shape recognition with nearest neighbor isomorphic network","authors":"H. Yau, M. Manry","doi":"10.1109/NNSP.1991.239517","DOIUrl":null,"url":null,"abstract":"The nearest neighbor isomorphic network paradigm is a combination of sigma-pi units in the hidden layer and product units in the output layer. Good initial weights can be found through clustering of the input training vectors, and the network can be successfully trained via backpropagation learning. The authors show theoretical conditions under which the product operation can replace the Min operation. Advantages to the product operation are summarized. Under some sufficient conditions, the product operation yields the same classification results as the Min operation. They apply their algorithm to a geometric shape recognition problem and compare the performances with those of two other well-known algorithms.<<ETX>>","PeriodicalId":354832,"journal":{"name":"Neural Networks for Signal Processing Proceedings of the 1991 IEEE Workshop","volume":"102 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks for Signal Processing Proceedings of the 1991 IEEE Workshop","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NNSP.1991.239517","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
The nearest neighbor isomorphic network paradigm combines sigma-pi units in the hidden layer with product units in the output layer. Good initial weights can be found by clustering the input training vectors, and the network can then be trained successfully via backpropagation. The authors derive theoretical sufficient conditions under which the product operation can replace the Min operation while yielding the same classification results, and they summarize the advantages of the product operation. They apply their algorithm to a geometric shape recognition problem and compare its performance with that of two other well-known algorithms.
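As a rough illustration of the Min-versus-product aggregation discussed in the abstract, the Python sketch below compares a nearest-prototype (Min) decision rule with a product-based surrogate over cluster prototypes obtained by k-means. The specific distance terms, the 1/(1+d) form, and the clustering routine are assumptions chosen for illustration only; they are not the paper's exact sigma-pi / product-unit formulation or its equivalence conditions.

```python
import numpy as np

# Illustrative sketch only: the distance terms and aggregation below are
# assumptions, not the paper's exact sigma-pi / product-unit network.

def kmeans(X, k, iters=20, seed=0):
    """Crude k-means to obtain per-class prototypes (the clustering-based
    initialization mentioned in the abstract)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def classify_min(x, prototypes_per_class):
    """Nearest-neighbor rule: pick the class whose closest prototype has the
    smallest squared distance (the Min operation)."""
    scores = [min(((x - p) ** 2).sum() for p in protos)
              for protos in prototypes_per_class]
    return int(np.argmin(scores))

def classify_product(x, prototypes_per_class):
    """Product aggregation: replace the Min over prototypes with a product of
    monotone distance terms, giving a smooth surrogate score per class."""
    scores = [np.prod([1.0 / (1.0 + ((x - p) ** 2).sum()) for p in protos])
              for protos in prototypes_per_class]
    return int(np.argmax(scores))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Two well-separated toy 2-D classes; two prototypes per class.
    class0 = rng.normal(loc=(0.0, 0.0), scale=0.3, size=(50, 2))
    class1 = rng.normal(loc=(2.0, 2.0), scale=0.3, size=(50, 2))
    protos = [kmeans(class0, 2), kmeans(class1, 2)]
    x = np.array([1.8, 2.1])
    print(classify_min(x, protos), classify_product(x, protos))  # expected: 1 1
```

One presumable advantage of the product form, in line with the abstract's remark that the product can replace Min, is that it is differentiable everywhere, which is what permits backpropagation training after the clustering-based initialization.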