Integrating deep learning and econometrics for stock price prediction: A comprehensive comparison of LSTM, transformers, and traditional time series models

Impact Factor: 4.9
Eyas Gaffar A. Osman, Faisal A. Otaibi
{"title":"整合深度学习和计量经济学用于股票价格预测:LSTM、变压器和传统时间序列模型的综合比较","authors":"Eyas Gaffar A. Osman,&nbsp;Faisal A. Otaibi","doi":"10.1016/j.mlwa.2025.100730","DOIUrl":null,"url":null,"abstract":"<div><div>This study presents a comprehensive empirical comparison between state-of-the-art deep learning models including Long Short-Term Memory (LSTM) networks, Transformer architectures, and traditional econometric models (ARIMA and VAR) for stock price prediction, with particular focus on performance during the COVID-19 pandemic crisis. Using daily S&amp;P 500 data from 2015 to 2020, we rigorously evaluate model performance across multiple metrics including Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE). Our findings demonstrate that while Transformer models achieve the best overall performance with an RMSE of 41.87 and directional accuracy of 69.1 %, LSTM networks provide an optimal balance between performance (RMSE: 43.25) and computational efficiency. Both deep learning approaches significantly outperform traditional econometric methods, with LSTM achieving a 53.3 % reduction in RMSE compared to ARIMA models. During the COVID-19 crisis period, deep learning models demonstrated exceptional robustness, with Transformers showing only 45 % performance degradation compared to over 100 % degradation in traditional models. Through comprehensive attention analysis, we provide insights into model interpretability, revealing adaptive behavior across market regimes. The study contributes to the growing literature on artificial intelligence applications in finance by providing rigorous empirical evidence for the superiority of modern deep learning approaches, while addressing the critical need for comparison with cutting-edge Transformer architectures that have revolutionized machine learning in recent years.</div></div>","PeriodicalId":74093,"journal":{"name":"Machine learning with applications","volume":"22 ","pages":"Article 100730"},"PeriodicalIF":4.9000,"publicationDate":"2025-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Integrating deep learning and econometrics for stock price prediction: A comprehensive comparison of LSTM, transformers, and traditional time series models\",\"authors\":\"Eyas Gaffar A. Osman,&nbsp;Faisal A. Otaibi\",\"doi\":\"10.1016/j.mlwa.2025.100730\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>This study presents a comprehensive empirical comparison between state-of-the-art deep learning models including Long Short-Term Memory (LSTM) networks, Transformer architectures, and traditional econometric models (ARIMA and VAR) for stock price prediction, with particular focus on performance during the COVID-19 pandemic crisis. Using daily S&amp;P 500 data from 2015 to 2020, we rigorously evaluate model performance across multiple metrics including Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE). Our findings demonstrate that while Transformer models achieve the best overall performance with an RMSE of 41.87 and directional accuracy of 69.1 %, LSTM networks provide an optimal balance between performance (RMSE: 43.25) and computational efficiency. Both deep learning approaches significantly outperform traditional econometric methods, with LSTM achieving a 53.3 % reduction in RMSE compared to ARIMA models. 
During the COVID-19 crisis period, deep learning models demonstrated exceptional robustness, with Transformers showing only 45 % performance degradation compared to over 100 % degradation in traditional models. Through comprehensive attention analysis, we provide insights into model interpretability, revealing adaptive behavior across market regimes. The study contributes to the growing literature on artificial intelligence applications in finance by providing rigorous empirical evidence for the superiority of modern deep learning approaches, while addressing the critical need for comparison with cutting-edge Transformer architectures that have revolutionized machine learning in recent years.</div></div>\",\"PeriodicalId\":74093,\"journal\":{\"name\":\"Machine learning with applications\",\"volume\":\"22 \",\"pages\":\"Article 100730\"},\"PeriodicalIF\":4.9000,\"publicationDate\":\"2025-09-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Machine learning with applications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2666827025001136\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Machine learning with applications","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2666827025001136","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

This study presents a comprehensive empirical comparison between state-of-the-art deep learning models including Long Short-Term Memory (LSTM) networks, Transformer architectures, and traditional econometric models (ARIMA and VAR) for stock price prediction, with particular focus on performance during the COVID-19 pandemic crisis. Using daily S&P 500 data from 2015 to 2020, we rigorously evaluate model performance across multiple metrics including Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE). Our findings demonstrate that while Transformer models achieve the best overall performance with an RMSE of 41.87 and directional accuracy of 69.1 %, LSTM networks provide an optimal balance between performance (RMSE: 43.25) and computational efficiency. Both deep learning approaches significantly outperform traditional econometric methods, with LSTM achieving a 53.3 % reduction in RMSE compared to ARIMA models. During the COVID-19 crisis period, deep learning models demonstrated exceptional robustness, with Transformers showing only 45 % performance degradation compared to over 100 % degradation in traditional models. Through comprehensive attention analysis, we provide insights into model interpretability, revealing adaptive behavior across market regimes. The study contributes to the growing literature on artificial intelligence applications in finance by providing rigorous empirical evidence for the superiority of modern deep learning approaches, while addressing the critical need for comparison with cutting-edge Transformer architectures that have revolutionized machine learning in recent years.
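The abstract reports RMSE, MAE, MAPE, and directional accuracy without spelling out how they are computed. The Python sketch below shows one standard way to calculate these metrics for one-step-ahead daily price forecasts; it is illustrative only and not taken from the paper. The function names, the placeholder price arrays, the sign-of-daily-change definition of directional accuracy, and the crisis-versus-normal definition of performance degradation are all assumptions.

import numpy as np

def rmse(y_true, y_pred):
    # Root Mean Squared Error between actual and predicted prices.
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mae(y_true, y_pred):
    # Mean Absolute Error.
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs(y_true - y_pred)))

def mape(y_true, y_pred):
    # Mean Absolute Percentage Error, expressed in percent.
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)

def directional_accuracy(y_true, y_pred):
    # Percentage of days where the predicted day-over-day move has the same sign as the actual move.
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    return float(np.mean(np.sign(np.diff(y_true)) == np.sign(np.diff(y_pred))) * 100)

def degradation_pct(error_crisis, error_normal):
    # Assumed definition: relative increase of an error metric in a crisis window versus a normal window, in percent.
    return float((error_crisis - error_normal) / error_normal * 100)

# Hypothetical daily closing prices and one-step-ahead forecasts (placeholder numbers, not from the paper).
actual = np.array([3000.0, 3012.5, 2998.1, 3020.4, 3035.0])
forecast = np.array([2995.0, 3010.0, 3005.0, 3015.0, 3030.0])
print(f"RMSE={rmse(actual, forecast):.2f}, MAE={mae(actual, forecast):.2f}, "
      f"MAPE={mape(actual, forecast):.2f}%, DA={directional_accuracy(actual, forecast):.1f}%")

Under this assumed definition of degradation, the reported 45 % figure would mean a crisis-period error roughly 1.45 times the normal-period error, while a degradation above 100 % means the error more than doubles.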
Source journal
Machine learning with applications
Subject areas: Management Science and Operations Research, Artificial Intelligence, Computer Science Applications
Self-citation rate: 0.00%
Articles published: 0
Review time: 98 days