{"title":"基于局部独立特征的非参数混合高斯朴素贝叶斯分类器","authors":"Ali Haghpanah Jahromi, M. Taheri","doi":"10.1109/AISP.2017.8324083","DOIUrl":null,"url":null,"abstract":"The naive Bayes is one of the useful classification techniques in data mining and machine learning. Although naive Bayes learners are efficient, they suffer from the weak assumption of conditional independence between the attributes. Many algorithms have been proposed to improve the effectiveness of naive Bayes classifier by inserting discriminant approaches into its generative structure. Combining generative and discriminative viewpoints is done in many algorithms e.g. by use of attribute weighting, instance weighting or ensemble method. In this paper, a new ensemble of Gaussian naive Bayes classifiers is proposed based on the mixture of Gaussian distributions formed on less conditional dependent features extracted by local PCA. A semi-AdaBoost approach is used for dynamic adaptation of distributions considering misclassified instances. The proposed method has been evaluated and compared with the related work on 12 UCI machine learning datasets and achievements show significant improvement on the performance.","PeriodicalId":386952,"journal":{"name":"2017 Artificial Intelligence and Signal Processing Conference (AISP)","volume":"57 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"96","resultStr":"{\"title\":\"A non-parametric mixture of Gaussian naive Bayes classifiers based on local independent features\",\"authors\":\"Ali Haghpanah Jahromi, M. Taheri\",\"doi\":\"10.1109/AISP.2017.8324083\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The naive Bayes is one of the useful classification techniques in data mining and machine learning. Although naive Bayes learners are efficient, they suffer from the weak assumption of conditional independence between the attributes. 
Many algorithms have been proposed to improve the effectiveness of naive Bayes classifier by inserting discriminant approaches into its generative structure. Combining generative and discriminative viewpoints is done in many algorithms e.g. by use of attribute weighting, instance weighting or ensemble method. In this paper, a new ensemble of Gaussian naive Bayes classifiers is proposed based on the mixture of Gaussian distributions formed on less conditional dependent features extracted by local PCA. A semi-AdaBoost approach is used for dynamic adaptation of distributions considering misclassified instances. The proposed method has been evaluated and compared with the related work on 12 UCI machine learning datasets and achievements show significant improvement on the performance.\",\"PeriodicalId\":386952,\"journal\":{\"name\":\"2017 Artificial Intelligence and Signal Processing Conference (AISP)\",\"volume\":\"57 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"96\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 Artificial Intelligence and Signal Processing Conference (AISP)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/AISP.2017.8324083\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 Artificial Intelligence and Signal Processing Conference 
(AISP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AISP.2017.8324083","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A non-parametric mixture of Gaussian naive Bayes classifiers based on local independent features
Naive Bayes is one of the most useful classification techniques in data mining and machine learning. Although naive Bayes learners are efficient, they suffer from the weak assumption of conditional independence between attributes. Many algorithms have been proposed to improve the effectiveness of the naive Bayes classifier by inserting discriminative approaches into its generative structure; combining the generative and discriminative viewpoints is done in many algorithms, e.g., by attribute weighting, instance weighting, or ensemble methods. In this paper, a new ensemble of Gaussian naive Bayes classifiers is proposed, based on a mixture of Gaussian distributions formed over less conditionally dependent features extracted by local PCA. A semi-AdaBoost approach is used to dynamically adapt the distributions to misclassified instances. The proposed method has been evaluated against related work on 12 UCI machine learning datasets, and the results show a significant improvement in performance.
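To make the core idea concrete: the abstract's motivation is that PCA decorrelates features, so a Gaussian naive Bayes learner fitted in the projected space violates the conditional-independence assumption less. The sketch below is only an illustration of that single building block (one global PCA plus one Gaussian naive Bayes learner), not the authors' full method — it omits the local PCA partitioning, the mixture, and the semi-AdaBoost reweighting, and all function names are of my own choosing.

```python
import numpy as np

def pca_transform(X, k):
    # Project onto the top-k principal components. Decorrelating the
    # features this way better matches naive Bayes' independence assumption.
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    top = vecs[:, np.argsort(vals)[::-1][:k]] # top-k eigenvectors
    return Xc @ top

def fit_gnb(X, y):
    # Per-class feature means, variances, and class priors.
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0),
                     Xc.var(axis=0) + 1e-9,   # small floor avoids divide-by-zero
                     len(Xc) / len(X))
    return params

def predict_gnb(params, X):
    # Pick the class maximizing log prior + sum of per-feature Gaussian
    # log-likelihoods (the naive independence factorization).
    classes = sorted(params)
    scores = []
    for c in classes:
        mu, var, prior = params[c]
        log_lik = -0.5 * np.sum(np.log(2 * np.pi * var)
                                + (X - mu) ** 2 / var, axis=1)
        scores.append(log_lik + np.log(prior))
    return np.array(classes)[np.argmax(np.vstack(scores), axis=0)]

# Demo on synthetic data with a deliberately correlated feature pair.
rng = np.random.default_rng(0)
n = 200
X0 = rng.normal(0.0, 1.0, (n, 4))
X0[:, 1] = X0[:, 0] + 0.1 * rng.normal(size=n)  # feature 1 ~ copy of feature 0
X1 = rng.normal(1.5, 1.0, (n, 4))
X1[:, 1] = X1[:, 0] + 0.1 * rng.normal(size=n)
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

Z = pca_transform(X, 2)           # decorrelated 2-D representation
model = fit_gnb(Z, y)
accuracy = (predict_gnb(model, Z) == y).mean()
```

In the paper's method, a separate local PCA would be fitted per mixture component, and the semi-AdaBoost loop would reweight misclassified instances when refitting the component distributions.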