Monthly streamflow forecasting with temporal-periodic transformer

IF 5.9 · Earth Science Tier 1 · Q1 ENGINEERING, CIVIL
Hanlin Yin, Qirui Zheng, Chenxu Wei, Congcong Liang, Minhao Fan, Xiuwei Zhang, Yanning Zhang
{"title":"Monthly streamflow forecasting with temporal-periodic transformer","authors":"Hanlin Yin ,&nbsp;Qirui Zheng ,&nbsp;Chenxu Wei ,&nbsp;Congcong Liang ,&nbsp;Minhao Fan ,&nbsp;Xiuwei Zhang ,&nbsp;Yanning Zhang","doi":"10.1016/j.jhydrol.2025.133308","DOIUrl":null,"url":null,"abstract":"<div><div>Monthly streamflow forecasting is important for water resources planning and management in hydrology. In recent years, deep learning based data-driven approaches have received significant attention, especially the Long Short-Term Memory (LSTM) and the Transformer. Among the above two sorts of models for such a task, hardly any model considers the periodic information from the same month of different years directly. This periodic information is important for monthly streamflow forecasting and we propose a periodic attention mechanism to explore it in this paper. Specifically, we propose a novel Temporal-Periodic Transformer (TPT) model, which has temporal-periodic attention modules exploring the temporal information and the periodic information. As a comparison, the original Transformer-based streamflow forecasting model does not consider such periodic information explicitly. To show the performance of our TPT model, two datasets including the Catchment Attributes and Meteorology for Large-sample Studies in Australia (CAMELS-AUS) and a dataset from the Tangnaihai Hydrological Station located in Qinghai Province of China are employed in this paper. Our TPT model outperforms the benchmark Transformer model significantly, e.g., for Nash–Sutcliffe efficiency, the TPT model improves over the original Transformer-based model in 45.9% and furthermore its NSE achieves 0.9108 in Tangnaihai by pretraining in 20 selected basins in CAMELS-AUS. For monthly streamflow forecasting, the TPT model is a good choice.</div></div>","PeriodicalId":362,"journal":{"name":"Journal of Hydrology","volume":"660 ","pages":"Article 133308"},"PeriodicalIF":5.9000,"publicationDate":"2025-04-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Hydrology","FirstCategoryId":"89","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0022169425006468","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, CIVIL","Score":null,"Total":0}
Citations: 0

Abstract

Monthly streamflow forecasting is important for water resources planning and management in hydrology. In recent years, deep-learning-based data-driven approaches have received significant attention, especially the Long Short-Term Memory (LSTM) network and the Transformer. Among these two families of models, hardly any directly considers the periodic information carried by the same month across different years. This periodic information is important for monthly streamflow forecasting, and in this paper we propose a periodic attention mechanism to exploit it. Specifically, we propose a novel Temporal-Periodic Transformer (TPT) model, whose temporal-periodic attention modules capture both the temporal information and the periodic information. By comparison, the original Transformer-based streamflow forecasting model does not consider such periodic information explicitly. To evaluate the TPT model, two datasets are employed: the Catchment Attributes and Meteorology for Large-sample Studies in Australia (CAMELS-AUS) and a dataset from the Tangnaihai Hydrological Station in Qinghai Province, China. The TPT model significantly outperforms the benchmark Transformer model; for example, it improves the Nash–Sutcliffe efficiency (NSE) over the original Transformer-based model by 45.9%, and its NSE reaches 0.9108 at Tangnaihai after pretraining on 20 selected basins from CAMELS-AUS. For monthly streamflow forecasting, the TPT model is therefore a good choice.
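The paper's implementation is not reproduced on this page, so the following is only a minimal sketch of the idea the abstract describes: alongside ordinary attention over the most recent months, a periodic branch attends over values observed in the same calendar month of previous years, and the two branches are fused before forecasting. Module names, dimensions, and the fusion step are illustrative assumptions, not the authors' actual TPT architecture; the NSE helper simply mirrors the metric quoted above.

```python
# Illustrative sketch only: names, dimensions, and the fusion step are assumptions,
# not the authors' TPT architecture as published.
import torch
import torch.nn as nn


class TemporalPeriodicAttention(nn.Module):
    def __init__(self, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        # Temporal branch: self-attention over the most recent months.
        self.temporal_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Periodic branch: self-attention over the same calendar month of previous years.
        self.periodic_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Simple fusion by concatenation and projection (an assumption for illustration).
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, recent: torch.Tensor, same_month: torch.Tensor) -> torch.Tensor:
        # recent:     (batch, n_recent_months, d_model), e.g. the last 12 months
        # same_month: (batch, n_years, d_model), this calendar month in past years
        t_out, _ = self.temporal_attn(recent, recent, recent)
        p_out, _ = self.periodic_attn(same_month, same_month, same_month)
        # Summarise each branch by its last position and fuse the two summaries.
        fused = self.fuse(torch.cat([t_out[:, -1], p_out[:, -1]], dim=-1))
        return fused  # (batch, d_model), to be fed to a forecasting head


def nse(obs: torch.Tensor, sim: torch.Tensor) -> torch.Tensor:
    """Nash-Sutcliffe efficiency, the metric quoted in the abstract (1.0 is a perfect fit)."""
    return 1.0 - torch.sum((obs - sim) ** 2) / torch.sum((obs - obs.mean()) ** 2)
```

A forecasting head (for example, a single linear layer) would map the fused vector to next month's streamflow; the periodic branch over same-month history is what distinguishes this sketch from a plain Transformer encoder.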
Source journal

Journal of Hydrology (Geosciences / Earth Science, comprehensive)
CiteScore: 11.00
Self-citation rate: 12.50%
Articles per year: 1309
Review time: 7.5 months
Journal description: The Journal of Hydrology publishes original research papers and comprehensive reviews in all the subfields of the hydrological sciences, including water-based management and policy issues that impact on economics and society. These comprise, but are not limited to, the physical, chemical, biogeochemical, stochastic and systems aspects of surface and groundwater hydrology, hydrometeorology and hydrogeology. Relevant topics incorporating the insights and methodologies of disciplines such as climatology, water resource systems, hydraulics, agrohydrology, geomorphology, soil science, instrumentation and remote sensing, and civil and environmental engineering are included. Social science perspectives on hydrological problems, such as resource and ecological economics, environmental sociology, psychology and behavioural science, management and policy analysis, are also invited. Multi- and interdisciplinary analyses of hydrological problems are within scope. The science published in the Journal of Hydrology is relevant to catchment scales rather than exclusively to a local scale or site.