{"title":"Subhealth state classification with AdaBoost learner","authors":"Sheng Sun, Zhiya Zuo, Guozheng Li, Xiao-bo Yang","doi":"10.1504/IJFIPM.2013.057406","DOIUrl":null,"url":null,"abstract":"Biopsychosocial approaches are the mainstay diagnostic methods for subhealth. This paper introduces the AdaBoost Learner to handle this issue. AdaBoost algorithm combine a series of weak classifiers, each of which performs slightly better than random guessing, to a strong one. In this paper, the AdaBoost Learners with discriminant classifiers and decision trees are built and two strong classifiers, support vector machine (SVM) and k–nearest neighbour (kNN), are adopted as control experiments. Two classification processes are constructed to distinguish health states and subhealth types respectively, where Fisher Score feature selection is for comparing performance with different feature subsets. Results indicate that the AdaBoost Learner with decision trees is the best among four classifiers in health states classification while the one with discriminant classifiers has the greatest performance in subhealth types classification. In health states classification, the highest accuracy reached 85.76% with 320 questions and 87.58% with 120 questions in subhealth types classification.","PeriodicalId":216126,"journal":{"name":"Int. J. Funct. Informatics Pers. Medicine","volume":"46 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Int. J. Funct. Informatics Pers. Medicine","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1504/IJFIPM.2013.057406","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Biopsychosocial approaches are the mainstay diagnostic methods for subhealth. This paper introduces the AdaBoost learner to handle this issue. The AdaBoost algorithm combines a series of weak classifiers, each of which performs only slightly better than random guessing, into a strong classifier. In this paper, AdaBoost learners with discriminant classifiers and with decision trees are built, and two strong classifiers, the support vector machine (SVM) and k-nearest neighbour (kNN), are adopted as control experiments. Two classification processes are constructed to distinguish health states and subhealth types, respectively, with Fisher Score feature selection used to compare performance across different feature subsets. Results indicate that the AdaBoost learner with decision trees performs best among the four classifiers in health state classification, while the one with discriminant classifiers performs best in subhealth type classification. The highest accuracy reached 85.76% with 320 questions in health state classification and 87.58% with 120 questions in subhealth type classification.
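To make the described pipeline concrete, the sketch below shows one plausible way to combine Fisher Score feature selection with boosted decision stumps and compare against SVM and kNN baselines, using scikit-learn. This is not the authors' implementation: the synthetic data, the number of retained features, and all hyperparameters are illustrative assumptions, and the questionnaire data used in the paper are not reproduced here.

```python
# A minimal sketch (assumed setup, not the paper's code): Fisher-score feature
# ranking followed by AdaBoost over decision stumps, with SVM and kNN as the
# two strong-classifier baselines mentioned in the abstract.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC


def fisher_score(X, y):
    """Per-feature Fisher score: between-class scatter / within-class scatter."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        within += len(Xc) * Xc.var(axis=0)
    return between / (within + 1e-12)


# Synthetic stand-in for the questionnaire: 320 "questions" (features) and a
# binary health-state label. Sizes and labels here are purely illustrative.
X, y = make_classification(n_samples=500, n_features=320, n_informative=40,
                           random_state=0)

# Keep the top-k features by Fisher score (k = 120 is an arbitrary example;
# in a real study this selection should be refit inside each CV fold).
k = 120
top_k = np.argsort(fisher_score(X, y))[::-1][:k]
X_sel = X[:, top_k]

# AdaBoost's default weak learner in scikit-learn is a depth-1 decision tree
# (a decision stump), i.e. a classifier only slightly better than guessing.
models = {
    "AdaBoost (decision stumps)": AdaBoostClassifier(n_estimators=200,
                                                     random_state=0),
    "SVM": SVC(kernel="rbf", gamma="scale"),
    "kNN": KNeighborsClassifier(n_neighbors=5),
}
for name, model in models.items():
    acc = cross_val_score(model, X_sel, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```

On synthetic data the printed accuracies are meaningless in themselves; the point of the sketch is the structure of the comparison, with feature subsets of different sizes (e.g. 320 vs. 120 questions) swapped in by changing k.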