Increasing Boosting Effectiveness with Estimation of Distribution Algorithms
Henry E. L. Cagnini, M. Basgalupp, Rodrigo C. Barros
2018 IEEE Congress on Evolutionary Computation (CEC), July 2018
DOI: 10.1109/CEC.2018.8477959
Citations: 1
Abstract
Ensemble learning is the machine learning paradigm that integrates several base learners into a single system, under the assumption that the collective consensus outperforms a single strong learner, whether in terms of effectiveness, efficiency, or any other problem-specific metric. Ensemble learning comprises three main phases: generation, selection, and integration, and there are several possible (deterministic or stochastic) strategies for executing one or more of those phases. In this paper, our focus is on improving the predictive accuracy of the well-known AdaBoost algorithm. By using its original voting weights as the starting point for a global search carried out by an Estimation of Distribution Algorithm, we are able to improve AdaBoost's predictive accuracy by up to $\approx 11$ % in a thorough experimental analysis with multiple public datasets.
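The core idea described in the abstract — seeding an Estimation of Distribution Algorithm's search with AdaBoost's voting weights and evolving them against validation accuracy — can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: it assumes a simple univariate Gaussian EDA (UMDA-style), scikit-learn's `AdaBoostClassifier` as the base ensemble, and a hypothetical fitness function `weighted_vote_accuracy` that scores a weight vector by weighted majority vote on a held-out set.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

def weighted_vote_accuracy(weights, preds, y):
    """Accuracy of a weighted majority vote.

    preds: (n_estimators, n_samples) array of {-1, +1} base predictions.
    """
    return np.mean(np.sign(weights @ preds) == y)

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=400, random_state=0)
y = 2 * y - 1  # map labels to {-1, +1} for sign-based voting
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.5,
                                            random_state=0)

# Fit AdaBoost and cache each base learner's validation predictions.
ada = AdaBoostClassifier(n_estimators=20, random_state=0).fit(X_tr, y_tr)
preds = np.array([est.predict(X_val) for est in ada.estimators_])
w0 = ada.estimator_weights_[:len(ada.estimators_)]  # original voting weights

# Univariate Gaussian EDA: the distribution is initialized around
# AdaBoost's weights, then re-estimated from the elite each generation.
mu, sigma = w0.copy(), np.full_like(w0, 0.5)
pop_size, elite_k = 50, 10
best_w = w0
best_acc = weighted_vote_accuracy(w0, preds, y_val)
for gen in range(30):
    pop = np.clip(rng.normal(mu, sigma, size=(pop_size, len(mu))), 0.0, None)
    fit = np.array([weighted_vote_accuracy(w, preds, y_val) for w in pop])
    elite = pop[np.argsort(fit)[-elite_k:]]
    mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-3
    if fit.max() > best_acc:
        best_acc, best_w = fit.max(), pop[fit.argmax()]

print(f"AdaBoost validation accuracy : {weighted_vote_accuracy(w0, preds, y_val):.3f}")
print(f"EDA-refined accuracy         : {best_acc:.3f}")
```

Because the EDA is seeded at AdaBoost's own weights and keeps the incumbent best, the refined accuracy can never fall below the baseline on the validation set used for fitness; in practice one would evaluate the final weights on a separate test set to avoid overfitting the search.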