Exploring the parameter state space of stacking

A. Seewald
2002 IEEE International Conference on Data Mining (ICDM 2002), Proceedings, December 9, 2002
DOI: 10.1109/ICDM.2002.1184029
Citations: 14

Abstract

Ensemble learning schemes are a new field in data mining. While current research concentrates mainly on improving the performance of single learning algorithms, an alternative is to combine learners with different biases. Stacking is the best-known such scheme; it tries to combine learners' predictions or confidences via another learning algorithm. However, the adoption of stacking by the data mining community is hampered by its large parameter space, consisting mainly of other learning algorithms: (1) the set of learning algorithms to combine, (2) the meta-learner responsible for the combining, and (3) the type of meta-data to use (confidences or predictions). None of these parameters are obvious choices. Furthermore, little is known about the relation between parameter settings and the performance of stacking. By exploring all of stacking's parameter settings and their interdependencies, we attempt to make stacking a suitable choice for mainstream data mining applications.
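The three parameters the abstract enumerates map directly onto the knobs of a modern stacking implementation. As a minimal sketch (using scikit-learn's `StackingClassifier`, which postdates this paper but exposes the same choices: the base-learner set, the meta-learner, and confidence versus prediction meta-data; the specific learners and dataset here are illustrative assumptions, not the paper's experimental setup):

```python
# Sketch of stacking's three parameter choices, illustrated with
# scikit-learn (an assumption: the paper itself used Weka-era tooling).
from sklearn.datasets import load_iris
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# (1) the set of base learners to combine, chosen for differing biases
base_learners = [
    ("tree", DecisionTreeClassifier(random_state=0)),
    ("nb", GaussianNB()),
    ("knn", KNeighborsClassifier()),
]

# (2) the meta-learner responsible for the combining
meta_learner = LogisticRegression(max_iter=1000)

# (3) the type of meta-data: class-confidence vectors ("predict_proba")
#     versus bare class predictions ("predict")
stacker = StackingClassifier(
    estimators=base_learners,
    final_estimator=meta_learner,
    stack_method="predict_proba",  # set to "predict" for prediction meta-data
    cv=5,  # meta-features come from cross-validated base-learner outputs
)
stacker.fit(X, y)
print(round(stacker.score(X, y), 2))
```

The `cv` argument matters: the meta-learner is trained on out-of-fold base-learner outputs, since training it on resubstitution predictions would overfit, which is the standard stacked-generalization construction the paper's parameter study builds on.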