Naïve Bayes ensemble learning based on oracle selection

Kai Li, Lifeng Hao
{"title":"Naïve基于oracle选择的贝叶斯集成学习","authors":"Kai Li, Lifeng Hao","doi":"10.1109/CCDC.2009.5194867","DOIUrl":null,"url":null,"abstract":"Aiming at the stability of Naïve Bayes algorithm and overcoming the limitation of the attributes independence assumption in the Naive Bayes learning, we present an ensemble learning algorithm for naive Bayesian classifiers based on oracle selection (OSBE). Firstly we weaken the stability of the naive Bayes with oracle strategy, then select the better classifier as the component of ensemble of the naive Bayesian classifiers, finally integrate the classifiers' results with voting method. The experiments show that OSBE ensemble algorithm obviously improves the generalization performance which is compared with the Naïve Bayes learning. And it prove in some cases the OSBE algorithm have better classification accuracy than Bagging and Adaboost.","PeriodicalId":127110,"journal":{"name":"2009 Chinese Control and Decision Conference","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2009-06-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"Naïve Bayes ensemble learning based on oracle selection\",\"authors\":\"Kai Li, Lifeng Hao\",\"doi\":\"10.1109/CCDC.2009.5194867\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Aiming at the stability of Naïve Bayes algorithm and overcoming the limitation of the attributes independence assumption in the Naive Bayes learning, we present an ensemble learning algorithm for naive Bayesian classifiers based on oracle selection (OSBE). Firstly we weaken the stability of the naive Bayes with oracle strategy, then select the better classifier as the component of ensemble of the naive Bayesian classifiers, finally integrate the classifiers' results with voting method. The experiments show that OSBE ensemble algorithm obviously improves the generalization performance which is compared with the Naïve Bayes learning. And it prove in some cases the OSBE algorithm have better classification accuracy than Bagging and Adaboost.\",\"PeriodicalId\":127110,\"journal\":{\"name\":\"2009 Chinese Control and Decision Conference\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2009-06-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2009 Chinese Control and Decision Conference\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CCDC.2009.5194867\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 Chinese Control and Decision Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCDC.2009.5194867","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 7

Abstract

To address the stability of the Naïve Bayes algorithm and to overcome the limitation of the attribute-independence assumption in naive Bayes learning, we present an ensemble learning algorithm for naive Bayesian classifiers based on oracle selection (OSBE). First, the stability of naive Bayes is weakened with an oracle strategy; then the better-performing classifiers are selected as components of the naive Bayesian ensemble; finally, the classifiers' results are combined by voting. Experiments show that the OSBE ensemble algorithm clearly improves generalization performance compared with Naïve Bayes learning, and in some cases achieves better classification accuracy than Bagging and AdaBoost.
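The abstract does not specify how the oracle strategy perturbs the base learners or how the "better" classifiers are chosen, so the following is only a minimal illustrative sketch, not the paper's algorithm. It assumes the perturbation can be approximated by bootstrap resampling plus random attribute subsets, that selection means keeping candidates whose held-out accuracy matches or beats a single naive Bayes baseline, and that the final combination is an unweighted majority vote. The function names, the selection threshold, and all parameters (n_candidates, feature_frac) are hypothetical choices made for the example.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.datasets import load_iris


def train_selective_nb_ensemble(X, y, n_candidates=25, feature_frac=0.7, seed=0):
    """Train a pool of perturbed naive Bayes classifiers and keep only those
    that match or beat a single naive Bayes baseline on a validation split.
    (Illustrative stand-in for oracle-based selection, not the paper's exact method.)"""
    rng = np.random.default_rng(seed)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=seed)

    # Baseline: one naive Bayes classifier trained on all features.
    baseline = GaussianNB().fit(X_tr, y_tr)
    baseline_acc = accuracy_score(y_val, baseline.predict(X_val))

    ensemble = []  # list of (classifier, feature_indices)
    n_features = X.shape[1]
    k = max(1, int(feature_frac * n_features))
    for _ in range(n_candidates):
        # Perturb: bootstrap the training rows and draw a random feature subset,
        # which destabilizes the otherwise stable naive Bayes learner.
        rows = rng.integers(0, len(X_tr), size=len(X_tr))
        cols = rng.choice(n_features, size=k, replace=False)
        clf = GaussianNB().fit(X_tr[rows][:, cols], y_tr[rows])
        # Select: keep the candidate only if it is at least as good as the baseline.
        if accuracy_score(y_val, clf.predict(X_val[:, cols])) >= baseline_acc:
            ensemble.append((clf, cols))
    # Fall back to the baseline if no candidate passed the selection step.
    return ensemble if ensemble else [(baseline, np.arange(n_features))]


def predict_majority_vote(ensemble, X):
    """Combine member predictions by unweighted majority vote."""
    votes = np.stack([clf.predict(X[:, cols]) for clf, cols in ensemble])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), axis=0, arr=votes)


if __name__ == "__main__":
    X, y = load_iris(return_X_y=True)
    ens = train_selective_nb_ensemble(X, y)
    print("ensemble size:", len(ens))
    print("training-set vote accuracy:", accuracy_score(y, predict_majority_vote(ens, X)))
```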