Look Ahead: Improving the Accuracy of Time-Series Forecasting by Previewing Future Time Features
Seonmin Kim, Dong-Kyu Chae
Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2023), published July 18, 2023
DOI: 10.1145/3539618.3592013
Citations: 0
Abstract
Time-series forecasting has been actively studied and adopted in various real-world domains. Recently, two research mainstreams have emerged in this area: building Transformer-based architectures such as Informer, Autoformer, and Reformer, and developing time-series representation learning frameworks based on contrastive learning, such as TS2Vec and CoST. Both efforts have greatly improved the performance of time-series forecasting. In this paper, we investigate a novel direction for improving forecasting performance even further, one that is orthogonal to the aforementioned mainstreams as a model-agnostic scheme. We focus on time stamp embeddings, which have received little attention in the literature. Our idea is simple yet effective: based on the given current time stamp, we predict the embeddings of its near-future time stamps and utilize the predicted embeddings in the time-series (value) forecasting task. We believe that if such future time information can be previewed at prediction time, it can be utilized by any time-series forecasting model as useful additional information. Our experimental results confirm that our method consistently and significantly improves the accuracy of recent Transformer-based models and time-series representation learning frameworks. Our code is available at: https://github.com/sunsunmin/Look_Ahead
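The scheme described above can be sketched minimally in Python. This is a toy illustration of the data flow only, not the authors' implementation: the embedding function, the linear look-ahead predictor, the dimensions, and all variable names are assumptions for illustration (in the paper the embeddings and the predictor would be learned end to end with the forecasting model).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): illustration only.
EMB_DIM = 8   # size of a time-stamp embedding
HORIZON = 4   # how far ahead the "look ahead" previews

def timestamp_embedding(t, dim=EMB_DIM):
    """Toy sinusoidal embedding of an integer time index
    (a stand-in for a learned time-stamp embedding)."""
    freqs = np.arange(1, dim + 1)
    return np.sin(t * freqs / dim)

# A toy look-ahead predictor: a linear map from the current
# time-stamp embedding to the embedding of time t + HORIZON.
# In practice this map would be trained; here its weights are random.
W = rng.normal(scale=0.1, size=(EMB_DIM, EMB_DIM))

def predict_future_embedding(current_emb):
    return current_emb @ W

t = 100
current_emb = timestamp_embedding(t)
future_emb = predict_future_embedding(current_emb)

# Model-agnostic use: the predicted future-time embedding is simply
# appended to the inputs of any downstream forecasting model.
values = rng.normal(size=16)  # recent observed window of the series
model_input = np.concatenate([values, current_emb, future_emb])
print(model_input.shape)  # (16 + 8 + 8,) = (32,)
```

Because the extra signal enters only through input concatenation, any forecasting backbone (Transformer-based or representation-learning-based) can consume it without architectural changes, which is what makes the scheme model-agnostic.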