Generative Representation Learning in Recurrent Neural Networks for Causal Timeseries Forecasting

Georgios Chatziparaskevas;Ioannis Mademlis;Ioannis Pitas
{"title":"递归神经网络在因果时间序列预测中的生成表示学习","authors":"Georgios Chatziparaskevas;Ioannis Mademlis;Ioannis Pitas","doi":"10.1109/TAI.2024.3446465","DOIUrl":null,"url":null,"abstract":"Feed-forward deep neural networks (DNNs) are the state of the art in timeseries forecasting. A particularly significant scenario is the causal one: when an arbitrary subset of variables of a given multivariate timeseries is specified as forecasting target, with the remaining ones (exogenous variables) \n<italic>causing</i>\n the target at each time instance. Then, the goal is to predict a temporal window of future target values, given a window of historical exogenous values. To this end, this article proposes a novel deep recurrent neural architecture, called generative-regressing recurrent neural network (GRRNN), which surpasses competing ones in causal forecasting evaluation metrics, by smartly combining generative learning and regression. During training, the generative module learns to synthesize historical target timeseries from historical exogenous inputs via conditional adversarial learning, thus internally encoding the input timeseries into semantically meaningful features. During a forward pass, these features are passed over as input to the regression module, which outputs the actual future target forecasts in a sequence-to-sequence fashion. Thus, the task of timeseries generation is synergistically combined with the task of timeseries forecasting, under an end-to-end multitask training setting. Methodologically, GRRNN contributes a novel augmentation of pure supervised learning, tailored to causal timeseries forecasting, which essentially forces the generative module to transform the historical exogenous timeseries to a more appropriate representation, before feeding it as input to the actual forecasting regressor. Extensive experimental evaluation on relevant public datasets obtained from disparate fields, ranging from air pollution data to sentiment analysis of social media posts, confirms that GRRNN achieves top performance in multistep long-term forecasting.","PeriodicalId":73305,"journal":{"name":"IEEE transactions on artificial intelligence","volume":"5 12","pages":"6412-6425"},"PeriodicalIF":0.0000,"publicationDate":"2024-08-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Generative Representation Learning in Recurrent Neural Networks for Causal Timeseries Forecasting\",\"authors\":\"Georgios Chatziparaskevas;Ioannis Mademlis;Ioannis Pitas\",\"doi\":\"10.1109/TAI.2024.3446465\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Feed-forward deep neural networks (DNNs) are the state of the art in timeseries forecasting. A particularly significant scenario is the causal one: when an arbitrary subset of variables of a given multivariate timeseries is specified as forecasting target, with the remaining ones (exogenous variables) \\n<italic>causing</i>\\n the target at each time instance. Then, the goal is to predict a temporal window of future target values, given a window of historical exogenous values. To this end, this article proposes a novel deep recurrent neural architecture, called generative-regressing recurrent neural network (GRRNN), which surpasses competing ones in causal forecasting evaluation metrics, by smartly combining generative learning and regression. 
During training, the generative module learns to synthesize historical target timeseries from historical exogenous inputs via conditional adversarial learning, thus internally encoding the input timeseries into semantically meaningful features. During a forward pass, these features are passed over as input to the regression module, which outputs the actual future target forecasts in a sequence-to-sequence fashion. Thus, the task of timeseries generation is synergistically combined with the task of timeseries forecasting, under an end-to-end multitask training setting. Methodologically, GRRNN contributes a novel augmentation of pure supervised learning, tailored to causal timeseries forecasting, which essentially forces the generative module to transform the historical exogenous timeseries to a more appropriate representation, before feeding it as input to the actual forecasting regressor. Extensive experimental evaluation on relevant public datasets obtained from disparate fields, ranging from air pollution data to sentiment analysis of social media posts, confirms that GRRNN achieves top performance in multistep long-term forecasting.\",\"PeriodicalId\":73305,\"journal\":{\"name\":\"IEEE transactions on artificial intelligence\",\"volume\":\"5 12\",\"pages\":\"6412-6425\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-08-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE transactions on artificial intelligence\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10643032/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on artificial intelligence","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10643032/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Feed-forward deep neural networks (DNNs) are the state of the art in timeseries forecasting. A particularly significant scenario is the causal one: when an arbitrary subset of variables of a given multivariate timeseries is specified as forecasting target, with the remaining ones (exogenous variables) causing the target at each time instance. Then, the goal is to predict a temporal window of future target values, given a window of historical exogenous values. To this end, this article proposes a novel deep recurrent neural architecture, called generative-regressing recurrent neural network (GRRNN), which surpasses competing ones in causal forecasting evaluation metrics, by smartly combining generative learning and regression. During training, the generative module learns to synthesize historical target timeseries from historical exogenous inputs via conditional adversarial learning, thus internally encoding the input timeseries into semantically meaningful features. During a forward pass, these features are passed over as input to the regression module, which outputs the actual future target forecasts in a sequence-to-sequence fashion. Thus, the task of timeseries generation is synergistically combined with the task of timeseries forecasting, under an end-to-end multitask training setting. Methodologically, GRRNN contributes a novel augmentation of pure supervised learning, tailored to causal timeseries forecasting, which essentially forces the generative module to transform the historical exogenous timeseries to a more appropriate representation, before feeding it as input to the actual forecasting regressor. Extensive experimental evaluation on relevant public datasets obtained from disparate fields, ranging from air pollution data to sentiment analysis of social media posts, confirms that GRRNN achieves top performance in multistep long-term forecasting.
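To make the two-module design described above more concrete, below is a minimal PyTorch sketch of that structure, not the authors' implementation. The module names (GenerativeModule, RegressionModule, GRRNNSketch), the use of GRU layers, and all dimensions are assumptions for illustration; the conditional adversarial loss that trains the generative module in the paper is replaced here by a plain MSE reconstruction term to keep the example self-contained.

```python
# Minimal structural sketch of a GRRNN-style model (PyTorch). Assumptions, not taken
# from the paper: GRU layers, the specific dimensions, and an MSE reconstruction term
# standing in for the conditional adversarial loss used to train the generative module.
import torch
import torch.nn as nn


class GenerativeModule(nn.Module):
    """Encodes the historical exogenous window into features and synthesizes
    the historical target timeseries from them."""
    def __init__(self, exog_dim, target_dim, hidden_dim):
        super().__init__()
        self.encoder = nn.GRU(exog_dim, hidden_dim, batch_first=True)
        self.to_target = nn.Linear(hidden_dim, target_dim)

    def forward(self, exog_hist):                        # (batch, T_hist, exog_dim)
        features, _ = self.encoder(exog_hist)            # (batch, T_hist, hidden_dim)
        synth_hist_target = self.to_target(features)     # synthesized historical target
        return features, synth_hist_target


class RegressionModule(nn.Module):
    """Maps the generator's internal features to a window of future target values
    in a sequence-to-sequence fashion."""
    def __init__(self, hidden_dim, target_dim, horizon):
        super().__init__()
        self.decoder = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, target_dim)
        self.horizon = horizon

    def forward(self, features):
        summary = features[:, -1:, :]                     # last-step feature vector
        dec_in = summary.repeat(1, self.horizon, 1)       # repeat across forecast horizon
        out, _ = self.decoder(dec_in)
        return self.head(out)                             # (batch, horizon, target_dim)


class GRRNNSketch(nn.Module):
    """Generative module feeding a regression module, trained end to end."""
    def __init__(self, exog_dim=4, target_dim=1, hidden_dim=64, horizon=24):
        super().__init__()
        self.generator = GenerativeModule(exog_dim, target_dim, hidden_dim)
        self.regressor = RegressionModule(hidden_dim, target_dim, horizon)

    def forward(self, exog_hist):
        features, synth_hist_target = self.generator(exog_hist)
        forecast = self.regressor(features)               # forecaster sees only the features
        return forecast, synth_hist_target


if __name__ == "__main__":
    model = GRRNNSketch()
    exog_hist = torch.randn(8, 48, 4)                     # 8 series, 48 past steps, 4 exogenous vars
    hist_target = torch.randn(8, 48, 1)
    future_target = torch.randn(8, 24, 1)

    forecast, synth_hist_target = model(exog_hist)
    # Multitask objective: forecasting loss + generation loss. The paper trains the
    # generative part adversarially (conditional GAN); plain MSE is used here only
    # to keep the sketch runnable without a discriminator.
    loss = nn.functional.mse_loss(forecast, future_target) + \
           nn.functional.mse_loss(synth_hist_target, hist_target)
    loss.backward()
    print(forecast.shape, loss.item())
```

The point the sketch illustrates is that the regressor never consumes the raw exogenous window directly: it operates on the representation produced by the generative encoder, and both objectives are optimized jointly in a single end-to-end multitask training pass.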