Mixture of experts leveraging Informer and LSTM variants for enhanced daily streamflow forecasting

IF 5.9 · JCR Q1 (Engineering, Civil) · CAS Tier 1 (Earth Science)
Zerong Rong, Wei Sun, Yutong Xie, Zexi Huang, Xinlin Chen
{"title":"Mixture of experts leveraging Informer and LSTM variants for enhanced daily streamflow forecasting","authors":"Zerong Rong ,&nbsp;Wei Sun ,&nbsp;Yutong Xie ,&nbsp;Zexi Huang ,&nbsp;Xinlin Chen","doi":"10.1016/j.jhydrol.2025.132737","DOIUrl":null,"url":null,"abstract":"<div><div>Streamflow forecasting is of paramount importance for water resources management and flood prevention. Machine learning, particularly deep learning, has had significant success in hydrological forecasting. However, there is still a desire for newer single-type and integrated architectures to further enhance the accuracy and reliability of forecasts. Recently, Transformer-based models have emerged as promising tools, and their effectiveness in streamflow modeling tasks warrants further investigation. The Mixture of Experts (MoE) model has also demonstrated potential in other fields, but its application in the hydrological domain remains relatively limited. This study presents an innovative streamflow forecasting model for the Quinebaug River Basin in Connecticut, USA, based on the MoE framework. Firstly, the hyperparameters of expert models, including LSTM, GRU, LSTM-Sequence to Sequence-Attention, and Informer, with lead times ranging from 1 to 8 days, are optimized using the grid search method. Subsequently, Random Forest, LSTM, and Transformer are used as routers to construct 4-class and 2-class MoE frameworks. Finally, the classified outputs are integrated to synthesize the streamflow forecasting results. The results indicate that the Informer model outperforms other benchmark models in all forecast periods, especially in shorter ones. Both 4-class and 2-class MoE can improve the streamflow forecasting results of the optimal sub-model to some extent: when the lead time reaches 5 days or more, the NSE increases by nearly or more than 3 %. This study highlights that the MoE framework can improve daily streamflow forecasting accuracy by integrating the strengths of different experts, although the router tends to prioritize models with superior performance during the classification process.</div></div>","PeriodicalId":362,"journal":{"name":"Journal of Hydrology","volume":"653 ","pages":"Article 132737"},"PeriodicalIF":5.9000,"publicationDate":"2025-01-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Hydrology","FirstCategoryId":"89","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0022169425000757","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, CIVIL","Score":null,"Total":0}
Citations: 0

Abstract

Streamflow forecasting is of paramount importance for water resources management and flood prevention. Machine learning, particularly deep learning, has achieved notable success in hydrological forecasting. Nevertheless, newer single-type and integrated architectures are still needed to further enhance the accuracy and reliability of forecasts. Recently, Transformer-based models have emerged as promising tools, and their effectiveness in streamflow modeling warrants further investigation. The Mixture of Experts (MoE) model has also demonstrated potential in other fields, but its application in hydrology remains relatively limited. This study presents an innovative streamflow forecasting model for the Quinebaug River Basin in Connecticut, USA, based on the MoE framework. First, the hyperparameters of the expert models, including LSTM, GRU, LSTM-Sequence-to-Sequence-Attention, and Informer, with lead times ranging from 1 to 8 days, are optimized using grid search. Subsequently, Random Forest, LSTM, and Transformer models are used as routers to construct 4-class and 2-class MoE frameworks. Finally, the classified outputs are integrated to synthesize the streamflow forecasts. The results indicate that the Informer model outperforms the other benchmark models at all lead times, especially shorter ones. Both the 4-class and 2-class MoE improve on the forecasts of the optimal sub-model to some extent: at lead times of 5 days or more, the Nash-Sutcliffe efficiency (NSE) increases by nearly 3% or more. This study highlights that the MoE framework can improve daily streamflow forecasting accuracy by integrating the strengths of different experts, although the router tends to favor the better-performing models during classification.
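To make the MoE idea from the abstract concrete, here is a minimal, hypothetical sketch (not the authors' code): a router assigns each forecast day to one of two experts, the chosen per-day forecasts are merged, and the combined series is scored with the Nash-Sutcliffe efficiency, NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2). The expert stand-ins (`expert_a`, `expert_b`), the oracle-style router labels, and all other names are illustrative assumptions; in the paper the router would be a trained Random Forest, LSTM, or Transformer.

```python
# Minimal 2-class MoE sketch for daily streamflow forecasting (illustrative
# only; all data and names are synthetic placeholders, not the paper's setup).
import numpy as np


def nse(obs: np.ndarray, sim: np.ndarray) -> float:
    """Nash-Sutcliffe efficiency: 1 means a perfect fit, 0 matches the mean."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)


def moe_combine(forecasts: np.ndarray, router_labels: np.ndarray) -> np.ndarray:
    """Select, for each time step, the forecast of the router-chosen expert.

    forecasts:     shape (n_experts, n_steps), one forecast series per expert.
    router_labels: shape (n_steps,), expert index chosen for each step.
    """
    return forecasts[router_labels, np.arange(forecasts.shape[1])]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic "observed" daily streamflow.
    obs = 10 + 3 * np.sin(np.linspace(0, 6, 200)) + rng.normal(0, 0.3, 200)

    # Two hypothetical experts with different error characteristics.
    expert_a = obs + rng.normal(0, 0.5, 200)  # stand-in for e.g. Informer
    expert_b = obs + rng.normal(0, 1.0, 200)  # stand-in for e.g. LSTM
    forecasts = np.stack([expert_a, expert_b])

    # A trained router would predict these labels from the inputs; here we use
    # an oracle-style router purely to illustrate the per-step combination.
    labels = (np.abs(expert_b - obs) < np.abs(expert_a - obs)).astype(int)

    combined = moe_combine(forecasts, labels)
    print(f"NSE expert A: {nse(obs, expert_a):.3f}")
    print(f"NSE expert B: {nse(obs, expert_b):.3f}")
    print(f"NSE MoE     : {nse(obs, combined):.3f}")
```

Because the router picks the locally better expert per day, the combined NSE can exceed that of the best single expert, which mirrors the abstract's finding that the MoE improves on the optimal sub-model at longer lead times.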
Source Journal

Journal of Hydrology (Geosciences: Multidisciplinary)
CiteScore: 11.00
Self-citation rate: 12.50%
Annual articles: 1309
Review time: 7.5 months
Journal description: The Journal of Hydrology publishes original research papers and comprehensive reviews in all the subfields of the hydrological sciences, including water-based management and policy issues that impact on economics and society. These comprise, but are not limited to, the physical, chemical, biogeochemical, stochastic and systems aspects of surface and groundwater hydrology, hydrometeorology and hydrogeology. Relevant topics incorporating the insights and methodologies of disciplines such as climatology, water resource systems, hydraulics, agrohydrology, geomorphology, soil science, instrumentation and remote sensing, and civil and environmental engineering are included. Social science perspectives on hydrological problems, such as resource and ecological economics, environmental sociology, psychology and behavioural science, management and policy analysis, are also invited. Multi- and interdisciplinary analyses of hydrological problems are within scope. The science published in the Journal of Hydrology is relevant to catchment scales rather than exclusively to a local scale or site.