{"title":"整合深度学习和计量经济学用于股票价格预测:LSTM、变压器和传统时间序列模型的综合比较","authors":"Eyas Gaffar A. Osman, Faisal A. Otaibi","doi":"10.1016/j.mlwa.2025.100730","DOIUrl":null,"url":null,"abstract":"<div><div>This study presents a comprehensive empirical comparison between state-of-the-art deep learning models including Long Short-Term Memory (LSTM) networks, Transformer architectures, and traditional econometric models (ARIMA and VAR) for stock price prediction, with particular focus on performance during the COVID-19 pandemic crisis. Using daily S&P 500 data from 2015 to 2020, we rigorously evaluate model performance across multiple metrics including Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE). Our findings demonstrate that while Transformer models achieve the best overall performance with an RMSE of 41.87 and directional accuracy of 69.1 %, LSTM networks provide an optimal balance between performance (RMSE: 43.25) and computational efficiency. Both deep learning approaches significantly outperform traditional econometric methods, with LSTM achieving a 53.3 % reduction in RMSE compared to ARIMA models. During the COVID-19 crisis period, deep learning models demonstrated exceptional robustness, with Transformers showing only 45 % performance degradation compared to over 100 % degradation in traditional models. Through comprehensive attention analysis, we provide insights into model interpretability, revealing adaptive behavior across market regimes. 
The study contributes to the growing literature on artificial intelligence applications in finance by providing rigorous empirical evidence for the superiority of modern deep learning approaches, while addressing the critical need for comparison with cutting-edge Transformer architectures that have revolutionized machine learning in recent years.</div></div>","PeriodicalId":74093,"journal":{"name":"Machine learning with applications","volume":"22 ","pages":"Article 100730"},"PeriodicalIF":4.9000,"publicationDate":"2025-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Integrating deep learning and econometrics for stock price prediction: A comprehensive comparison of LSTM, transformers, and traditional time series models\",\"authors\":\"Eyas Gaffar A. Osman, Faisal A. Otaibi\",\"doi\":\"10.1016/j.mlwa.2025.100730\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>This study presents a comprehensive empirical comparison between state-of-the-art deep learning models including Long Short-Term Memory (LSTM) networks, Transformer architectures, and traditional econometric models (ARIMA and VAR) for stock price prediction, with particular focus on performance during the COVID-19 pandemic crisis. Using daily S&P 500 data from 2015 to 2020, we rigorously evaluate model performance across multiple metrics including Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE). Our findings demonstrate that while Transformer models achieve the best overall performance with an RMSE of 41.87 and directional accuracy of 69.1 %, LSTM networks provide an optimal balance between performance (RMSE: 43.25) and computational efficiency. Both deep learning approaches significantly outperform traditional econometric methods, with LSTM achieving a 53.3 % reduction in RMSE compared to ARIMA models. 
During the COVID-19 crisis period, deep learning models demonstrated exceptional robustness, with Transformers showing only 45 % performance degradation compared to over 100 % degradation in traditional models. Through comprehensive attention analysis, we provide insights into model interpretability, revealing adaptive behavior across market regimes. The study contributes to the growing literature on artificial intelligence applications in finance by providing rigorous empirical evidence for the superiority of modern deep learning approaches, while addressing the critical need for comparison with cutting-edge Transformer architectures that have revolutionized machine learning in recent years.</div></div>\",\"PeriodicalId\":74093,\"journal\":{\"name\":\"Machine learning with applications\",\"volume\":\"22 \",\"pages\":\"Article 100730\"},\"PeriodicalIF\":4.9000,\"publicationDate\":\"2025-09-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Machine learning with applications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2666827025001136\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Machine learning with applications","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2666827025001136","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
This study presents a comprehensive empirical comparison of state-of-the-art deep learning models, namely Long Short-Term Memory (LSTM) networks and Transformer architectures, with traditional econometric models (ARIMA and VAR) for stock price prediction, with particular focus on performance during the COVID-19 pandemic crisis. Using daily S&P 500 data from 2015 to 2020, we rigorously evaluate model performance across multiple metrics, including Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE). Our findings demonstrate that while Transformer models achieve the best overall performance, with an RMSE of 41.87 and a directional accuracy of 69.1%, LSTM networks provide an optimal balance between performance (RMSE: 43.25) and computational efficiency. Both deep learning approaches significantly outperform traditional econometric methods, with LSTM achieving a 53.3% reduction in RMSE compared to ARIMA models. During the COVID-19 crisis period, deep learning models demonstrated exceptional robustness, with Transformers showing only 45% performance degradation compared to over 100% degradation in traditional models. Through comprehensive attention analysis, we provide insights into model interpretability, revealing adaptive behavior across market regimes. The study contributes to the growing literature on artificial intelligence applications in finance by providing rigorous empirical evidence for the superiority of modern deep learning approaches, while addressing the critical need for comparison with cutting-edge Transformer architectures that have revolutionized machine learning in recent years.
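The abstract evaluates models with RMSE, MAE, MAPE, and directional accuracy. As the paper's evaluation code is not published here, the following is a minimal sketch of how these four metrics are conventionally computed on aligned series of realized and predicted prices; the function names and interfaces are illustrative, not the authors' implementation.

```python
import math

def rmse(y_true, y_pred):
    """Root Mean Squared Error."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    """Mean Absolute Error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mape(y_true, y_pred):
    """Mean Absolute Percentage Error, in percent."""
    return 100 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

def directional_accuracy(y_true, y_pred):
    """Share of periods (in percent) where the predicted day-over-day
    change has the same sign as the realized change."""
    hits = sum(
        ((y_true[i] - y_true[i - 1]) > 0) == ((y_pred[i] - y_pred[i - 1]) > 0)
        for i in range(1, len(y_true))
    )
    return 100 * hits / (len(y_true) - 1)
```

Under the usual reduction formula (RMSE_ARIMA − RMSE_LSTM) / RMSE_ARIMA, the reported 53.3% reduction together with the LSTM RMSE of 43.25 would imply an ARIMA RMSE of roughly 43.25 / (1 − 0.533) ≈ 92.6, though the abstract does not state that figure directly.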