Enhancing hydrological modeling with transformers: a case study for 24-h streamflow prediction
B. Demiray, M. Sit, Omer Mermer, Ibrahim Demir
Water Science & Technology, Journal Article, published 2024-04-04. DOI: 10.2166/wst.2024.110
In this paper, we address the critical task of 24-h streamflow forecasting using advanced deep-learning models, with a primary focus on the Transformer architecture, which has seen limited application in this task. We compare five models across four distinct regions: Persistence, long short-term memory (LSTM), Seq2Seq, gated recurrent unit (GRU), and Transformer. The evaluation is based on three performance metrics: Nash–Sutcliffe Efficiency (NSE), Pearson's r, and normalized root mean square error (NRMSE). Additionally, we investigate the impact of two data extension methods, zero-padding and persistence, on the models' predictive capabilities. Our findings highlight the Transformer's superiority in capturing complex temporal dependencies and patterns in streamflow data: it outperformed all other models in both accuracy and reliability, improving NSE scores by up to 20% compared with the other models. These insights emphasize the significance of leveraging advanced deep-learning techniques, such as the Transformer, in hydrological modeling and streamflow forecasting for effective water resource management and flood prediction.
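The three evaluation metrics named in the abstract are standard in hydrology. As a point of reference, here is a minimal NumPy sketch of their textbook definitions (illustrative only, not the authors' code; the range-based normalization in NRMSE is one common convention among several):

```python
import numpy as np

def nse(obs, sim):
    """Nash–Sutcliffe Efficiency: 1 - SSE / variance of observations.
    1.0 is a perfect fit; 0.0 means no better than the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pearson_r(obs, sim):
    """Pearson correlation coefficient between observed and simulated flows."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.corrcoef(obs, sim)[0, 1]

def nrmse(obs, sim):
    """RMSE normalized by the observed range (one common normalization)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    rmse = np.sqrt(np.mean((obs - sim) ** 2))
    return rmse / (obs.max() - obs.min())

# Toy example with hypothetical observed/simulated hydrographs:
obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
sim = np.array([1.1, 1.9, 3.2, 3.8, 5.1])
```

A reported "20% improvement in NSE" is thus an improvement in how much of the observed variance the forecast explains relative to a mean-flow baseline.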