Title: Application and Optimization of Long Short-Term Memory in Time Series Forecasting
Author: Chenen Jin
Published in: 2022 International Communication Engineering and Cloud Computing Conference (CECCC), 2022-10-28
DOI: 10.1109/CECCC56460.2022.10069825
Cited by: 1
Abstract
Learning to store information over long time intervals via backpropagation usually takes a great deal of time. Long short-term memory (LSTM) is therefore introduced for this task as a novel and efficient approach. By truncating the gradient where doing so is harmless, LSTM can bridge thousands of time steps by enforcing constant error flow through special cells that contain constant error carousels. Multiplicative gate units learn to open and close access to this constant error flow. Compared with recurrent cascade correlation, real-time recurrent learning, and neural sequence chunking, LSTM yields more successful results and learns in less time. LSTM can also solve complicated, artificial long-time-lag tasks. Owing to these strengths, the LSTM network serves as a useful tool in time series forecasting. We use an LSTM network to forecast future varicella cases, and the current LSTM network is optimized to increase its efficiency. Finally, when the number of hidden units is reduced from 200 to 100, the training time decreases while the accuracy of the trained network increases.
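The gating mechanism described above can be sketched as a minimal forward step of an LSTM cell. This is an illustrative NumPy implementation, not the paper's actual model: the weight initialization, input size, and toy sequence are assumptions, and the hidden size of 100 simply mirrors the smaller configuration the abstract reports.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell: multiplicative gates control access to the
    cell state c, the 'constant error carousel'."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        # One stacked weight block for the four gates:
        # input, forget, output, and candidate.
        self.W = rng.normal(0.0, 0.1, (4 * hidden_size, input_size + hidden_size))
        self.b = np.zeros(4 * hidden_size)
        self.hidden_size = hidden_size

    def step(self, x, h, c):
        H = self.hidden_size
        z = self.W @ np.concatenate([x, h]) + self.b
        i = sigmoid(z[0:H])        # input gate: opens access for new input
        f = sigmoid(z[H:2*H])      # forget gate: keeps or clears old state
        o = sigmoid(z[2*H:3*H])    # output gate: exposes the cell state
        g = np.tanh(z[3*H:4*H])    # candidate values
        c = f * c + i * g          # carousel update: near-constant error flow
        h = o * np.tanh(c)
        return h, c

# Run a toy 10-step sequence through a cell with 100 hidden units
# (the reduced size the paper found faster and more accurate than 200).
cell = LSTMCell(input_size=1, hidden_size=100)
h = np.zeros(100)
c = np.zeros(100)
for t in range(10):
    h, c = cell.step(np.array([float(t)]), h, c)
print(h.shape)  # (100,)
```

Because the hidden state is squashed by `tanh` and scaled by a sigmoid output gate, every component of `h` stays in (-1, 1); the unbounded cell state `c` is what carries information across the long time lags the abstract emphasizes.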