{"title":"自组织网络的最优特征提取","authors":"Y. A. Ghassabeh, H. Moghaddam","doi":"10.1109/CIRA.2007.382908","DOIUrl":null,"url":null,"abstract":"In this paper, we introduced new adaptive learning algorithms and related networks to extract optimal features from multidimensional data in order to reduce the data dimensionality while preserving class separability. For this purpose, new adaptive algorithms for the computation of the square root of the inverse covariance matrix Sigma-1/2 are introduced. We introduce a new cost function related to the given adaptive learning algorithms in order to prove their convergence. Self organized Sigma-1/2 networks are constructed based on these algorithms. By cascading Sigma-1/2 network and an adaptive principal component analysis (APCA) network, we present new adaptive self organized LDA feature extraction network. Adaptive nature of the new optimal feature extraction method makes it appropriate for on-line incremental pattern classification and machine learning applications. Both networks in the proposed structure are trained simultaneously, using a stream of input data. Existence of cost function, make it available to compute learning rate efficiently in every iteration in order to increase the convergence rate. Experimental results using synthetic multi-class multi-dimensional sequence of data, demonstrated the effectiveness of the new adaptive self organized feature extraction networks.","PeriodicalId":301626,"journal":{"name":"2007 International Symposium on Computational Intelligence in Robotics and Automation","volume":"97 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2007-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Self Organized Networks for Optimal Feature Extraction\",\"authors\":\"Y. A. Ghassabeh, H. Moghaddam\",\"doi\":\"10.1109/CIRA.2007.382908\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we introduced new adaptive learning algorithms and related networks to extract optimal features from multidimensional data in order to reduce the data dimensionality while preserving class separability. For this purpose, new adaptive algorithms for the computation of the square root of the inverse covariance matrix Sigma-1/2 are introduced. We introduce a new cost function related to the given adaptive learning algorithms in order to prove their convergence. Self organized Sigma-1/2 networks are constructed based on these algorithms. By cascading Sigma-1/2 network and an adaptive principal component analysis (APCA) network, we present new adaptive self organized LDA feature extraction network. Adaptive nature of the new optimal feature extraction method makes it appropriate for on-line incremental pattern classification and machine learning applications. Both networks in the proposed structure are trained simultaneously, using a stream of input data. Existence of cost function, make it available to compute learning rate efficiently in every iteration in order to increase the convergence rate. 
Experimental results using synthetic multi-class multi-dimensional sequence of data, demonstrated the effectiveness of the new adaptive self organized feature extraction networks.\",\"PeriodicalId\":301626,\"journal\":{\"name\":\"2007 International Symposium on Computational Intelligence in Robotics and Automation\",\"volume\":\"97 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2007-06-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2007 International Symposium on Computational Intelligence in Robotics and Automation\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CIRA.2007.382908\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2007 International Symposium on Computational Intelligence in Robotics and Automation","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CIRA.2007.382908","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
In this paper, we introduce new adaptive learning algorithms and related networks to extract optimal features from multidimensional data, reducing the data dimensionality while preserving class separability. For this purpose, new adaptive algorithms for computing the square root of the inverse covariance matrix, Σ^{-1/2}, are introduced. We introduce a new cost function associated with these adaptive learning algorithms in order to prove their convergence. Self-organized Σ^{-1/2} networks are constructed based on these algorithms. By cascading the Σ^{-1/2} network with an adaptive principal component analysis (APCA) network, we present a new adaptive self-organized LDA feature extraction network. The adaptive nature of the new optimal feature extraction method makes it suitable for on-line incremental pattern classification and machine learning applications. Both networks in the proposed structure are trained simultaneously using a stream of input data. The existence of a cost function makes it possible to compute the learning rate efficiently at every iteration, thereby increasing the convergence rate. Experimental results on synthetic multi-class, multi-dimensional data sequences demonstrate the effectiveness of the new adaptive self-organized feature extraction networks.
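To make the described pipeline concrete, the sketch below shows one minimal way to cascade an adaptive Σ^{-1/2} stage with an adaptive PCA stage on a synthetic multi-class data stream. It is an illustration only, not the authors' exact method: it assumes a known adaptive iteration W ← W + η(I − W x xᵀ W), which converges to Σ^{-1/2} for zero-mean inputs with covariance Σ, and Sanger's generalized Hebbian algorithm for the APCA stage; the paper's specific update rules, cost function, and learning-rate schedule are not reproduced, and all variable names and the synthetic data setup are illustrative.

```python
# Illustrative sketch (not the paper's exact algorithm): adaptive estimation of
# the within-class Sigma^{-1/2}, cascaded with an adaptive PCA stage (Sanger's
# GHA) to obtain LDA-like features from a stream of labeled samples.
import numpy as np

rng = np.random.default_rng(0)

# --- synthetic 3-class, 5-dimensional data stream ---------------------------
dim, n_classes, n_samples = 5, 3, 20000
means = rng.normal(scale=4.0, size=(n_classes, dim))              # class means
A = rng.normal(size=(dim, dim))
cov_chol = np.linalg.cholesky(A @ A.T / dim + 0.5 * np.eye(dim))  # shared within-class covariance

def sample():
    c = rng.integers(n_classes)
    return means[c] + cov_chol @ rng.normal(size=dim), c

# --- adaptive Sigma^{-1/2} stage ---------------------------------------------
# Assumed update rule: W <- W + eta * (I - W x x^T W), applied to within-class
# centered samples, so W approaches the within-class Sigma^{-1/2}.
W = np.eye(dim)
class_means = np.zeros((n_classes, dim))   # running per-class means for centering
class_counts = np.zeros(n_classes)
global_mean = np.zeros(dim)                # running global mean

# --- adaptive PCA stage (Sanger's generalized Hebbian algorithm) -------------
n_features = n_classes - 1                 # at most C-1 discriminant directions
U = rng.normal(scale=0.1, size=(n_features, dim))

for k in range(1, n_samples + 1):
    x, c = sample()
    class_counts[c] += 1
    class_means[c] += (x - class_means[c]) / class_counts[c]
    global_mean += (x - global_mean) / k
    xw = x - class_means[c]                # approx. zero-mean within-class sample

    eta = 1.0 / (100 + k)                  # decaying learning rate (illustrative)
    W += eta * (np.eye(dim) - W @ np.outer(xw, xw) @ W)

    # whitened, globally centered sample feeds the APCA/GHA stage
    y_in = W @ (x - global_mean)
    y = U @ y_in
    U += eta * (np.outer(y, y_in) - np.tril(np.outer(y, y)) @ U)  # Sanger's rule

# Combined transform: features = (U @ W) @ x approximate LDA discriminant directions.
print("estimated feature transform:\n", np.round(U @ W, 3))
```

Both stages are updated from the same sample at each step, mirroring the abstract's claim that the two networks are trained simultaneously on a single input stream; the decaying learning rate here is a simple placeholder for the cost-function-based rate selection described in the paper.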