Title: Unseen family member classification using mixture of experts
Authors: M. Ghahramani, H. L. Wang, W. Yau, E. Teoh
Published in: 2010 5th IEEE Conference on Industrial Electronics and Applications, 2010-06-15
DOI: 10.1109/ICIEA.2010.5516872 (https://doi.org/10.1109/ICIEA.2010.5516872)
Citations: 5
Abstract
All family members resemble one another in ways that the human brain can readily recognize. In this paper, we develop family classification using AdaBoost, Support Vector Machine, and K-Nearest Neighbor classifiers trained on different patches of the training data. In some cases, family classification involves classifying unseen data, on which the classifiers' performance drops significantly. A Mixture of Experts is therefore applied to improve their performance. For a fair comparison of the above approaches, three families from three different ethnic groups are used. Experimental results show an average accuracy of 76 percent, and up to a 27 percent accuracy improvement from majority voting over the mixture of experts, depending on the family data.
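The combination scheme named in the abstract, majority voting over several base classifiers, can be sketched with scikit-learn's hard-voting ensemble. This is an illustrative sketch only: the paper works on face-image patches from three families, whereas the synthetic three-class dataset, the estimator hyperparameters, and the train/test split below are all assumptions, not the authors' setup.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Stand-in data: a synthetic 3-class problem in place of the paper's
# face patches from 3 families (hypothetical, for illustration only).
X, y = make_classification(n_samples=300, n_features=20, n_informative=10,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The three base experts named in the abstract.
experts = [
    ("adaboost", AdaBoostClassifier(random_state=0)),
    ("svm", SVC(random_state=0)),  # hard voting only needs predict()
    ("knn", KNeighborsClassifier(n_neighbors=5)),
]

# Majority ("hard") voting: each expert casts a class label vote and
# the most frequent label wins.
ensemble = VotingClassifier(estimators=experts, voting="hard")
ensemble.fit(X_train, y_train)
acc = ensemble.score(X_test, y_test)
```

With `voting="hard"`, disagreements among the three experts are resolved by plurality, which is why combining weak-but-diverse classifiers can lift accuracy on unseen members relative to any single expert.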