Two-stage attentional temporal convolution and LSTM model for financial data forecasting

Lifang Chen, Xiaowan Li, Zhenping Xie
DOI: 10.1117/12.2682556
Published in: Conference on Electronic Information Engineering and Data Processing, 26 May 2023

Abstract

Financial time series usually consist of multiple component series, and forecasting models use the historical data of multiple driving series to predict the future values of a target series. In recent years, attention-based Long Short-Term Memory (LSTM) neural networks and Temporal Convolutional Networks (TCN) have been widely used in time series forecasting. In this paper, we propose a two-stage attention-based TCN and LSTM hybrid forecasting model. In the first stage, to better capture the spatial correlation among the driving series, causal self-attention computes spatial attention weights over the driving series, and a TCN then extracts short-term features of the series. In the second stage, a temporal attention module adaptively assigns weights to the current and historical time steps of the input sequence, and an LSTM finally captures the long-term dependencies in the time-series data. We used the NASDAQ 100 stock dataset and the financial time series of CSI 300 companies to measure the performance of the proposed model in financial data forecasting.
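The pipeline described in the abstract (causal self-attention over driving series, a dilated causal convolution as the TCN building block, then temporal attention over the resulting features) can be sketched minimally as follows. This is an illustrative numpy sketch of the general mechanisms, not the authors' implementation; all function names, the kernel values, and the dot-product scoring are assumptions, and the final LSTM stage is omitted for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(X):
    """Stage 1 (spatial): attention weights over driving series with a
    causal mask, so the output at step t uses only steps <= t.
    X: (T, n_series)."""
    T, n = X.shape
    scores = X @ X.T / np.sqrt(n)                 # (T, T) similarities
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf                        # block future steps
    return softmax(scores, axis=1) @ X            # reweighted series

def causal_conv1d(X, kernel, dilation=1):
    """Minimal TCN building block: a dilated causal convolution,
    left-padded so the output at t depends only on inputs <= t."""
    k = len(kernel)
    pad = (k - 1) * dilation
    Xp = np.pad(X, ((pad, 0), (0, 0)))
    out = np.zeros_like(X)
    for i, w in enumerate(kernel):
        out += w * Xp[pad - i * dilation : pad - i * dilation + X.shape[0]]
    return out

def temporal_attention(H):
    """Stage 2 (temporal): score each historical step against the most
    recent step and form a weighted context vector. H: (T, d)."""
    scores = H @ H[-1]                            # (T,)
    w = softmax(scores)
    return w @ H                                  # (d,) context vector

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))                   # 8 steps, 3 driving series
A = causal_self_attention(X)                      # spatially reweighted input
H = causal_conv1d(A, kernel=[0.5, 0.3, 0.2], dilation=2)  # short-term features
ctx = temporal_attention(H)                       # context fed to the LSTM stage
print(ctx.shape)
```

In a full model the context vector would be combined with the LSTM hidden state at each step to capture long-term dependencies; the causal mask and left padding are what keep both stages from leaking future information into the forecast.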