Using parallel partitioning strategy to create diversity for ensemble learning

Yi-Min Wen, Yaonan Wang, Wen-Hua Liu
{"title":"Using parallel partitioning strategy to create diversity for ensemble learning","authors":"Yi-Min Wen, Yaonan Wang, Wen-Hua Liu","doi":"10.1109/ICCSIT.2009.5234490","DOIUrl":null,"url":null,"abstract":"Divide-and-conquer principle is a fashionable strategy to handle large-scale classification problems. However, many works have revealed that generalization ability is decreased by partitioning training set in most cases, because partitioning training set can lead to losing classification information. Aiming to handle this problem, an ensemble learning algorithm was proposed. It used many sets of parallel hyperplanes to partition training set on which each base classifier was trained by the SVM modular network algorithm and all these base classifiers were combined by majority voting strategy when testing. The experimental results on 4 classification problems illustrate that ensemble learning can effectively reduce the descent of generalization ability for the reason of increasing classifier's diversity.","PeriodicalId":342396,"journal":{"name":"2009 2nd IEEE International Conference on Computer Science and Information Technology","volume":"5 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-09-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 2nd IEEE International Conference on Computer Science and Information Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCSIT.2009.5234490","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

The divide-and-conquer principle is a popular strategy for handling large-scale classification problems. However, many studies have shown that partitioning the training set usually reduces generalization ability, because partitioning can discard classification information. To address this problem, an ensemble learning algorithm is proposed. It uses multiple sets of parallel hyperplanes to partition the training set; on each resulting partition a base classifier is trained with the SVM modular network algorithm, and at test time all base classifiers are combined by majority voting. Experimental results on four classification problems show that the ensemble effectively reduces the loss of generalization ability by increasing classifier diversity.
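To make the partition-then-vote idea concrete, the minimal sketch below partitions the training set with parallel hyperplanes, i.e., quantile thresholds along a random projection direction, trains one SVM per slab, repeats this for several random directions to obtain diverse base classifiers, and combines their predictions by majority vote at test time. The class `ParallelSlabEnsemble`, the use of scikit-learn's `SVC`, and the random-direction partitioning rule are illustrative assumptions; this is not the paper's SVM modular network algorithm, and labels are assumed to be non-negative integers.

```python
# A minimal sketch of the partition-then-vote idea described in the abstract,
# using scikit-learn's SVC as the base learner. The random-direction slab
# partitioning and all names here are illustrative assumptions; this is not
# the paper's SVM modular network algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC


class ParallelSlabEnsemble:
    def __init__(self, n_partitions=5, n_slabs=3, random_state=0):
        self.n_partitions = n_partitions  # number of base classifiers (voters)
        self.n_slabs = n_slabs            # slabs per partition, cut by parallel hyperplanes
        self.rng = np.random.default_rng(random_state)

    def fit(self, X, y):
        self.members_ = []
        for _ in range(self.n_partitions):
            # Parallel hyperplanes share one normal direction w; the slabs are
            # the regions between consecutive quantile thresholds along w.
            w = self.rng.normal(size=X.shape[1])
            proj = X @ w
            edges = np.quantile(proj, np.linspace(0, 1, self.n_slabs + 1)[1:-1])
            slab = np.digitize(proj, edges)
            experts = []
            for s in range(self.n_slabs):
                mask = slab == s
                if np.unique(y[mask]).size < 2:
                    experts.append(None)  # degenerate slab: handled at prediction time
                else:
                    experts.append(SVC(kernel="rbf", gamma="scale").fit(X[mask], y[mask]))
            self.members_.append((w, edges, experts))
        return self

    def predict(self, X):
        votes = []
        for w, edges, experts in self.members_:
            slab = np.digitize(X @ w, edges)
            pred = np.empty(len(X), dtype=int)
            for s, clf in enumerate(experts):
                mask = slab == s
                if not mask.any():
                    continue
                if clf is None:
                    # the slab saw a single class during training; borrow any trained expert
                    clf = next(e for e in experts if e is not None)
                pred[mask] = clf.predict(X[mask])
            votes.append(pred)
        # majority vote across the base classifiers (labels assumed to be 0, 1, ...)
        return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, np.vstack(votes))


if __name__ == "__main__":
    X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = ParallelSlabEnsemble(n_partitions=7, n_slabs=3).fit(X_tr, y_tr)
    print("test accuracy:", (model.predict(X_te) == y_te).mean())
```

Each set of parallel hyperplanes induces a different partition of the training data, so the base classifiers solve different local problems; that disagreement is the source of diversity the abstract appeals to.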