Meta-learning for time series forecasting in the NN GC1 competition

Christiane Lemke, B. Gabrys
DOI: 10.1109/FUZZY.2010.5584001
Published in: International Conference on Fuzzy Systems, 2010-07-18
Citations: 22

Abstract

According to the no-free-lunch theorem, no algorithm performs generally better or worse than random across all possible data sets. A specific forecasting method will therefore naturally perform differently in different empirical studies. This makes it impossible to draw general conclusions; however, there will of course be specific problems for which one algorithm performs better than another in practice. Meta-learning exploits this fact by linking characteristics of the data set to the performance of methods, adapting the selection or combination of base methods to a specific problem. This contribution describes an approach using meta-learning for time series forecasting in the NN GC1 competition. In order to generate a bigger and more reliable meta-data set, data from the past NN3 and NN5 competitions have been included. A pool of individual forecasting and combination models is combined using a ranking algorithm, with weights determined by past performance on similar series.
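The core idea of the abstract can be sketched in code: extract meta-features from a target series, find the most similar series in a meta-data set built from past competitions, and weight each base forecaster by its ranked past performance on those neighbors. The following is a minimal illustrative sketch, not the authors' implementation; the specific meta-features, the distance measure, the rank-to-weight mapping, and all function names are assumptions chosen for clarity.

```python
import numpy as np

def meta_features(series):
    """Illustrative meta-features (assumed, not from the paper):
    mean, standard deviation, lag-1 autocorrelation, linear trend slope."""
    x = np.asarray(series, dtype=float)
    t = np.arange(len(x))
    slope = np.polyfit(t, x, 1)[0]
    ac1 = np.corrcoef(x[:-1], x[1:])[0, 1]
    return np.array([x.mean(), x.std(), ac1, slope])

def rank_weights(errors):
    """Turn past errors into rank-based weights: the method with the
    smallest error gets the highest rank and hence the largest weight."""
    order = np.argsort(np.argsort(errors))   # 0 = best (smallest error)
    ranks = len(errors) - order              # best method -> largest number
    return ranks / ranks.sum()

def combine_forecasts(target_series, past_db, forecasts, k=2):
    """past_db: list of (series, per-method error vector) from past
    competitions (e.g. NN3/NN5); forecasts: per-method forecasts for the
    target series. Returns the weighted combined forecast and the weights."""
    f_target = meta_features(target_series)
    dists = [np.linalg.norm(meta_features(s) - f_target) for s, _ in past_db]
    nearest = np.argsort(dists)[:k]          # k most similar past series
    avg_err = np.mean([past_db[i][1] for i in nearest], axis=0)
    w = rank_weights(avg_err)
    return w @ np.asarray(forecasts), w
```

A rank-based scheme like this is deliberately coarse: it uses only the ordering of past errors, not their magnitudes, which makes the weights robust to outlier series in the meta-data set.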