{"title":"精算应用中数据丰富的经济预测","authors":"Felix Zhu , Yumo Dong , Fei Huang","doi":"10.1016/j.insmatheco.2025.103126","DOIUrl":null,"url":null,"abstract":"<div><div>With the advent of Big Data, machine learning, and artificial intelligence (AI) technologies, actuaries can now develop advanced models in a data-rich environment to achieve better forecasting performance and provide added value in many applications. Traditionally, economic forecasting for actuarial applications is developed using econometric models based on small datasets including only the target variables (usually around 4-6) and their lagged variables. This paper explores the value of economic forecasting using deep learning with a big dataset, Federal Reserve Bank of St Louis (FRED) database, consisting of 121 economic variables and their lagged variables covering periods before, during, and after the global financial crisis (GFC), and during COVID (2019-2021). Four target variables considered in this paper include inflation rate, interest rate, wage rate, and unemployment rate, which are common variables for social security funds forecasting. The proposed model “PCA-Net” combines dimension reduction via principal component analysis (PCA) and Neural Networks (including convolutional neural network (CNN), Long Short-Term Memory (LSTM), and fully-connected layers). PCA-Net generally outperforms the benchmark models based on vector autoregression (VAR) and Wilkie-like models, although the magnitude of its advantage varies across economic variables and forecast horizons. Using conformal prediction, this paper provides prediction intervals to quantify the prediction uncertainty. 
The model performance is demonstrated using a social security fund forecasting application.</div></div>","PeriodicalId":54974,"journal":{"name":"Insurance Mathematics & Economics","volume":"124 ","pages":"Article 103126"},"PeriodicalIF":2.2000,"publicationDate":"2025-06-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Data-rich economic forecasting for actuarial applications\",\"authors\":\"Felix Zhu , Yumo Dong , Fei Huang\",\"doi\":\"10.1016/j.insmatheco.2025.103126\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>With the advent of Big Data, machine learning, and artificial intelligence (AI) technologies, actuaries can now develop advanced models in a data-rich environment to achieve better forecasting performance and provide added value in many applications. Traditionally, economic forecasting for actuarial applications is developed using econometric models based on small datasets including only the target variables (usually around 4-6) and their lagged variables. This paper explores the value of economic forecasting using deep learning with a big dataset, Federal Reserve Bank of St Louis (FRED) database, consisting of 121 economic variables and their lagged variables covering periods before, during, and after the global financial crisis (GFC), and during COVID (2019-2021). Four target variables considered in this paper include inflation rate, interest rate, wage rate, and unemployment rate, which are common variables for social security funds forecasting. The proposed model “PCA-Net” combines dimension reduction via principal component analysis (PCA) and Neural Networks (including convolutional neural network (CNN), Long Short-Term Memory (LSTM), and fully-connected layers). 
PCA-Net generally outperforms the benchmark models based on vector autoregression (VAR) and Wilkie-like models, although the magnitude of its advantage varies across economic variables and forecast horizons. Using conformal prediction, this paper provides prediction intervals to quantify the prediction uncertainty. The model performance is demonstrated using a social security fund forecasting application.</div></div>\",\"PeriodicalId\":54974,\"journal\":{\"name\":\"Insurance Mathematics & Economics\",\"volume\":\"124 \",\"pages\":\"Article 103126\"},\"PeriodicalIF\":2.2000,\"publicationDate\":\"2025-06-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Insurance Mathematics & Economics\",\"FirstCategoryId\":\"96\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0167668725000733\",\"RegionNum\":2,\"RegionCategory\":\"经济学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ECONOMICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Insurance Mathematics & Economics","FirstCategoryId":"96","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0167668725000733","RegionNum":2,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ECONOMICS","Score":null,"Total":0}
Data-rich economic forecasting for actuarial applications
With the advent of Big Data, machine learning, and artificial intelligence (AI) technologies, actuaries can now develop advanced models in a data-rich environment to achieve better forecasting performance and provide added value in many applications. Traditionally, economic forecasting for actuarial applications is developed using econometric models based on small datasets that include only the target variables (usually around 4-6) and their lags. This paper explores the value of economic forecasting using deep learning with a big dataset, the Federal Reserve Economic Data (FRED) database maintained by the Federal Reserve Bank of St. Louis, consisting of 121 economic variables and their lagged variables covering periods before, during, and after the global financial crisis (GFC), and during the COVID-19 period (2019-2021). The four target variables considered in this paper are the inflation rate, interest rate, wage rate, and unemployment rate, which are common variables for social security fund forecasting. The proposed model, "PCA-Net", combines dimension reduction via principal component analysis (PCA) with neural networks (including convolutional neural network (CNN), Long Short-Term Memory (LSTM), and fully-connected layers). PCA-Net generally outperforms the benchmark models based on vector autoregression (VAR) and Wilkie-like models, although the magnitude of its advantage varies across economic variables and forecast horizons. Using conformal prediction, this paper provides prediction intervals to quantify the prediction uncertainty. The model performance is demonstrated using a social security fund forecasting application.
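The two ingredients named in the abstract — PCA dimension reduction over a wide panel of indicators, and split conformal prediction intervals — can be illustrated with a minimal NumPy sketch. Everything here is hypothetical: the data are synthetic stand-ins for a FRED-style panel, and a plain least-squares forecaster replaces the paper's CNN/LSTM layers, since the actual PCA-Net architecture is not reproduced in the abstract.

```python
# Hypothetical sketch: PCA compression of a wide indicator panel, then
# split conformal prediction intervals around a simple forecaster.
# Synthetic data and a linear model stand in for the paper's PCA-Net.
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a FRED-style panel: 200 periods x 121 indicators.
X = rng.normal(size=(200, 121))
y = 0.5 * X[:, 0] + rng.normal(scale=0.1, size=200)  # toy target series

# --- PCA via SVD: project the 121 indicators onto k principal components ---
k = 8
Xc = X - X.mean(axis=0)                 # center columns before SVD
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:k].T                       # component scores, shape (200, k)

# --- simple forecaster on the scores (stand-in for the CNN/LSTM layers) ---
train, calib, test = slice(0, 120), slice(120, 160), slice(160, None)
beta, *_ = np.linalg.lstsq(Z[train], y[train], rcond=None)
pred = Z @ beta

# --- split conformal interval at nominal 90% coverage ---
# The quantile of held-out absolute residuals gives a symmetric interval
# whose coverage guarantee does not depend on the model being correct.
alpha = 0.1
resid = np.abs(y[calib] - pred[calib])  # calibration residuals
n = resid.size
q = np.quantile(resid, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")
lo, hi = pred[test] - q, pred[test] + q
```

The split-conformal step is the reason the abstract can promise calibrated prediction intervals even for a black-box network: the coverage guarantee comes from the held-out residual quantile, not from any distributional assumption about the forecaster.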
Journal introduction:
Insurance: Mathematics and Economics publishes leading research spanning all fields of actuarial science. It appears six times per year and is the largest actuarial science journal in the world.
Insurance: Mathematics and Economics is an international academic journal that aims to strengthen communication between individuals and groups who develop and apply research results in actuarial science. The journal feels a particular obligation to facilitate closer cooperation between those who conduct research in insurance mathematics and quantitative insurance economics, and practicing actuaries who are interested in the implementation of the results. To this end, Insurance: Mathematics and Economics publishes high-quality articles of broad international interest, concerned with either the theory of insurance mathematics and quantitative insurance economics or its inventive application, including empirical or experimental results. Articles that combine several of these aspects are given particular consideration.