K. Sarpong, Bei Hui, Xue Zhou, Rutherford Agbeshi Patamia, Edwin Kwadwo Tenagyei
{"title":"在无噪声环境下用多注意网络完善短期股票预测","authors":"K. Sarpong, Bei Hui, Xue Zhou, Rutherford Agbeshi Patamia, Edwin Kwadwo Tenagyei","doi":"10.1145/3448734.3450779","DOIUrl":null,"url":null,"abstract":"The extreme uncertainties and volatile nature of the stock markets is an extensive field of study. The key to exploiting time series modelling strategies is crucial to achieving greater stock market efficiency. Even though various theoretical propositions in deep learning have developed, a few can capture long term temporal dependencies information and select the sailing series to make accurate forecasting. To overcome the problem, we propose wavelet two-stage attention-based long short term memory (WTS-ALSTM) for financial time series prediction. We use the wavelet transform decomposing to perform signal analysis and signal reconstruction of historical stock data for the noise reduction, extracts and train its characteristics, and sets the stock market forecast model. WTSALSTM model incorporates the resilient and non-linear interaction in the series, before introducing the input attention via past encoder hidden states, and temporal attention mechanism through the decoder stage at all-time steps across all the encoder hidden states. We benchmark the final results with twelve different models on DJIA, HSI, and S&P 500 datasets. Experimental results on the above datasets have illustrated that the proposed model can achieve competitive prediction performance in their metrics compared with other baseline models.","PeriodicalId":105999,"journal":{"name":"The 2nd International Conference on Computing and Data Science","volume":"91 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-01-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Perfecting Short-term Stock Predictions with Multi-Attention Networks in Noise-free Settings\",\"authors\":\"K. 
Sarpong, Bei Hui, Xue Zhou, Rutherford Agbeshi Patamia, Edwin Kwadwo Tenagyei\",\"doi\":\"10.1145/3448734.3450779\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The extreme uncertainties and volatile nature of the stock markets is an extensive field of study. The key to exploiting time series modelling strategies is crucial to achieving greater stock market efficiency. Even though various theoretical propositions in deep learning have developed, a few can capture long term temporal dependencies information and select the sailing series to make accurate forecasting. To overcome the problem, we propose wavelet two-stage attention-based long short term memory (WTS-ALSTM) for financial time series prediction. We use the wavelet transform decomposing to perform signal analysis and signal reconstruction of historical stock data for the noise reduction, extracts and train its characteristics, and sets the stock market forecast model. WTSALSTM model incorporates the resilient and non-linear interaction in the series, before introducing the input attention via past encoder hidden states, and temporal attention mechanism through the decoder stage at all-time steps across all the encoder hidden states. We benchmark the final results with twelve different models on DJIA, HSI, and S&P 500 datasets. 
Experimental results on the above datasets have illustrated that the proposed model can achieve competitive prediction performance in their metrics compared with other baseline models.\",\"PeriodicalId\":105999,\"journal\":{\"name\":\"The 2nd International Conference on Computing and Data Science\",\"volume\":\"91 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-01-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The 2nd International Conference on Computing and Data Science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3448734.3450779\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The 2nd International Conference on Computing and Data Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3448734.3450779","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Perfecting Short-term Stock Predictions with Multi-Attention Networks in Noise-free Settings
The extreme uncertainty and volatile nature of stock markets form an extensive field of study, and exploiting time series modelling strategies is crucial to achieving greater stock market efficiency. Although various theoretical propositions in deep learning have been developed, few can capture long-term temporal dependencies and select the salient series needed for accurate forecasting. To overcome this problem, we propose a wavelet two-stage attention-based long short-term memory (WTS-ALSTM) model for financial time series prediction. We use wavelet transform decomposition to perform signal analysis and reconstruction of historical stock data for noise reduction, extract and train on its characteristics, and build the stock market forecast model. The WTS-ALSTM model captures the resilient, non-linear interactions in the series, introducing input attention over past encoder hidden states and a temporal attention mechanism at the decoder stage, applied at every time step across all encoder hidden states. We benchmark the final results against twelve different models on the DJIA, HSI, and S&P 500 datasets. Experimental results on these datasets illustrate that the proposed model achieves competitive prediction performance compared with other baseline models.
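The wavelet denoising step described in the abstract — decompose the price series, shrink the high-frequency detail coefficients, and reconstruct — can be sketched as follows. This is a minimal illustration using a one-level Haar transform with soft thresholding; the paper's actual wavelet family, decomposition depth, and threshold rule are not specified in the abstract, so those choices here are assumptions.

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar wavelet denoising: decompose a series into
    approximation (trend) and detail (noise) coefficients, soft-threshold
    the details, then reconstruct. Assumes len(x) is even."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)  # low-frequency component
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)  # high-frequency component
    # Soft thresholding shrinks small detail coefficients toward zero,
    # suppressing noise while preserving the dominant trend.
    detail = np.sign(detail) * np.maximum(np.abs(detail) - thresh, 0.0)
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

# Hypothetical usage on a synthetic series (not data from the paper):
# a slow trend corrupted by small alternating noise.
prices = np.linspace(100.0, 110.0, 64) + 0.3 * (-1.0) ** np.arange(64)
smoothed = haar_denoise(prices, thresh=0.5)
```

With `thresh=0.0` the transform is a perfect reconstruction; larger thresholds trade fidelity for smoothness. In practice multi-level decompositions (e.g. with Daubechies wavelets via a library such as PyWavelets) are common for this kind of preprocessing.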