{"title":"基于非线性组合分类器的模式分类改进","authors":"Mohammed Falih Hassan, I. Abdel-Qader","doi":"10.1109/ICCI-CC.2016.7862081","DOIUrl":null,"url":null,"abstract":"In order to improve classification accuracy, multiple classifier systems have provided better pattern classification over single classifier systems in different applications. The theoretical frameworks proposed in [5], [7] present important tools for estimating and minimizing the added error of linearly combined classifier systems. In this work, we present a theoretical model that estimates the added error using geometric mean rule which is a nonlinear combining rule. In the derivation, we assume classifiers' outputs are uncorrelated and have identical distributions for a given class case. We also show that by setting the number of classifiers to one (a single classifier system), the derived formula is modified and matches the results given in [5]. We validated our derivations with computer simulations and compared these with the analytical results. Due to the nonlinearity of the geometric mean, theoretical results show that the bias and variance errors are mixed together in their contribution to the added error. It was shown that the bias error dominated the contribution to the added error compared to the variance error. It is possible to minimize the variance error by increasing the ensemble size (number of classifiers) while the bias error is minimized under certain conditions. The proposed theoretical work can help in investigating the added error for other nonlinear arithmetic combining rules.","PeriodicalId":135701,"journal":{"name":"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)","volume":"32 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Improving pattern classification by nonlinearly combined classifiers\",\"authors\":\"Mohammed Falih Hassan, I. Abdel-Qader\",\"doi\":\"10.1109/ICCI-CC.2016.7862081\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In order to improve classification accuracy, multiple classifier systems have provided better pattern classification over single classifier systems in different applications. The theoretical frameworks proposed in [5], [7] present important tools for estimating and minimizing the added error of linearly combined classifier systems. In this work, we present a theoretical model that estimates the added error using geometric mean rule which is a nonlinear combining rule. In the derivation, we assume classifiers' outputs are uncorrelated and have identical distributions for a given class case. We also show that by setting the number of classifiers to one (a single classifier system), the derived formula is modified and matches the results given in [5]. We validated our derivations with computer simulations and compared these with the analytical results. Due to the nonlinearity of the geometric mean, theoretical results show that the bias and variance errors are mixed together in their contribution to the added error. It was shown that the bias error dominated the contribution to the added error compared to the variance error. It is possible to minimize the variance error by increasing the ensemble size (number of classifiers) while the bias error is minimized under certain conditions. 
The proposed theoretical work can help in investigating the added error for other nonlinear arithmetic combining rules.\",\"PeriodicalId\":135701,\"journal\":{\"name\":\"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)\",\"volume\":\"32 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICCI-CC.2016.7862081\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 IEEE 15th International Conference on Cognitive Informatics & Cognitive Computing (ICCI*CC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCI-CC.2016.7862081","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Improving pattern classification by nonlinearly combined classifiers
Multiple classifier systems have provided better pattern classification accuracy than single classifier systems in a variety of applications. The theoretical frameworks proposed in [5], [7] offer important tools for estimating and minimizing the added error of linearly combined classifier systems. In this work, we present a theoretical model that estimates the added error of the geometric mean rule, a nonlinear combining rule. In the derivation, we assume that the classifiers' outputs are uncorrelated and identically distributed for a given class. We also show that when the number of classifiers is set to one (a single classifier system), the derived formula reduces to the result given in [5]. We validated the derivation with computer simulations and compared the simulated results against the analytical ones. Due to the nonlinearity of the geometric mean, the theoretical results show that the bias and variance errors are mixed in their contributions to the added error, with the bias error dominating the variance error. The variance error can be reduced by increasing the ensemble size (the number of classifiers), while the bias error is minimized only under certain conditions. The proposed theoretical work can help in investigating the added error of other nonlinear arithmetic combining rules.
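To make the combining rule concrete, the sketch below is a minimal illustration, not the paper's implementation, of the geometric mean rule and of the ensemble-size effect described above. It assumes uncorrelated, identically distributed noise on the classifiers' posterior estimates, matching the abstract's assumptions; the function name, toy posterior, and noise level are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def geometric_mean_combine(posteriors):
    """Fuse per-classifier posterior estimates with the geometric mean rule.

    posteriors: array of shape (L, C) -- L classifiers, C classes; each row
    is a (noisy) estimate of P(class | x). Returns the normalized fused
    posterior of shape (C,).
    """
    # Average in log space: the L-th root of the product of the estimates.
    logp = np.log(np.clip(posteriors, 1e-12, None))  # clip to avoid log(0)
    fused = np.exp(logp.mean(axis=0))
    return fused / fused.sum()

# Toy check of the ensemble-size effect under the paper's assumptions:
# uncorrelated, identically distributed estimation noise around a fixed
# true posterior. The fused decision's disagreement with the Bayes
# decision (a proxy for the added error) shrinks as L grows, consistent
# with the variance error being reduced by a larger ensemble.
true_post = np.array([0.55, 0.45])  # Bayes decision: class 0
noise_std = 0.15                    # illustrative noise level
for L in (1, 3, 9, 27):
    trials, errors = 20000, 0
    for _ in range(trials):
        noisy = np.clip(true_post + rng.normal(0.0, noise_std, size=(L, 2)),
                        1e-3, None)
        errors += geometric_mean_combine(noisy).argmax() != 0
    print(f"L = {L:2d}: disagreement with Bayes decision = {errors/trials:.3f}")
```

Computing the geometric mean as an average of log posteriors is the standard numerically stable form; it also makes explicit the rule's known sensitivity to near-zero estimates (a single classifier assigning a class near-zero probability can veto that class), one practical source of the nonlinear bias behavior the paper analyzes.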