M. Flores, J. A. Gamez, Ana M. Martínez, A. Salmerón
{"title":"Mixture of truncated exponentials in supervised classification: Case study for the naive bayes and averaged one-dependence estimators classifiers","authors":"M. Flores, J. A. Gamez, Ana M. Martínez, A. Salmerón","doi":"10.1109/ISDA.2011.6121720","DOIUrl":null,"url":null,"abstract":"The Averaged One-Dependence Estimators (AODE) classifier is one of the most attractive semi-naive Bayesian classifiers and hence a good alternative to Naive Bayes (NB), as it obtains fairly low error rates maintaining under control the computational complexity. Unfortunately, as most of the methods designed within the framework of Bayesian networks, AODE is exclusively defined to deal with discrete variables. Several approaches to avoid the use of discretization pre-processing techniques have already been presented, all of them involving in lower or greater degree the assumption of (conditional) Gaussian distributions. In this paper, we propose the use of Mixture of Truncated Exponentials (MTEs), whose expressive power to accurately approximate the most commonly used distributions for hybrid networks has already been demonstrated. We perform experiments on the use of MTEs over a large group of datasets for the first time, and we analyze the importance of selecting a proper number of points when learning MTEs for NB and AODE, as we believe, it is decisive to provide accurate results.","PeriodicalId":433207,"journal":{"name":"2011 11th International Conference on Intelligent Systems Design and Applications","volume":"21 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2011 11th International Conference on Intelligent Systems Design and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISDA.2011.6121720","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9
Abstract
The Averaged One-Dependence Estimators (AODE) classifier is one of the most attractive semi-naive Bayesian classifiers and hence a good alternative to Naive Bayes (NB), as it obtains fairly low error rates while keeping the computational complexity under control. Unfortunately, like most methods designed within the framework of Bayesian networks, AODE is defined exclusively for discrete variables. Several approaches to avoid the use of discretization pre-processing techniques have already been presented, all of them involving, to a lesser or greater degree, the assumption of (conditional) Gaussian distributions. In this paper, we propose the use of Mixtures of Truncated Exponentials (MTEs), whose expressive power to accurately approximate the most commonly used distributions for hybrid networks has already been demonstrated. We perform experiments on the use of MTEs over a large group of datasets for the first time, and we analyze the importance of selecting a proper number of points when learning MTEs for NB and AODE, as we believe it is decisive for obtaining accurate results.
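To make the core idea concrete, the sketch below shows the functional form of a univariate Mixture of Truncated Exponentials potential, where the density over each interval of the domain split is a constant plus a sum of exponential terms. This is a minimal illustration of the MTE representation only, not the paper's learning procedure; all class names, parameter values, and the two-piece example are assumptions made for illustration.

```python
# Minimal sketch of an MTE potential: on each interval [low, high) the density
# is a0 + sum_k a_k * exp(b_k * x). Names and numbers are illustrative only.
from dataclasses import dataclass
from typing import List, Tuple
import math


@dataclass
class MTEPiece:
    low: float                         # interval lower bound (inclusive)
    high: float                        # interval upper bound (exclusive)
    a0: float                          # constant term
    terms: List[Tuple[float, float]]   # (a_k, b_k) pairs for the exponential terms


def mte_density(pieces: List[MTEPiece], x: float) -> float:
    """Evaluate a piecewise MTE potential at x; returns 0 outside all intervals."""
    for p in pieces:
        if p.low <= x < p.high:
            return p.a0 + sum(a * math.exp(b * x) for a, b in p.terms)
    return 0.0


# Hypothetical two-piece MTE defined on [0, 2)
example = [
    MTEPiece(0.0, 1.0, a0=0.3, terms=[(0.2, 1.0)]),
    MTEPiece(1.0, 2.0, a0=0.1, terms=[(0.5, -0.5)]),
]
print(mte_density(example, 0.7))
```

In an MTE-based NB or AODE classifier, potentials of this kind would replace the discrete (or conditional Gaussian) conditional distributions of the continuous attributes; the number of split points per interval corresponds to the "number of points" whose selection the abstract identifies as decisive for accuracy.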