Wenbin Yu, Yangsong Li, Cheng Fan, Daoyong Fu, Chengjun Zhang, Yadang Chen, Ming Qian, Jie Liu, Gaoping Liu
{"title":"利用空间相关特征提取和深度时空融合网络进行降水预报","authors":"Wenbin Yu, Yangsong Li, Cheng Fan, Daoyong Fu, Chengjun Zhang, Yadang Chen, Ming Qian, Jie Liu, Gaoping Liu","doi":"10.1007/s12145-024-01412-5","DOIUrl":null,"url":null,"abstract":"<p>Precipitation nowcasting is crucial for various applications. However, existing deep learning models for meteorological applications face challenges regarding training efficiency, generalization of spatial features, and capturing long-range spatial dependencies. In particular, convolutional neural networks struggle to describe the complete spatial dependencies in radar echo reflectivity image sequences, making it difficult to model spatial features effectively. Additionally, current approaches using Encoder-Decoder structures based on recurrent neural networks have limited success in capturing global spatial dependencies and trajectory motion features in radar echo reflectivity images, especially for medium to high-intensity precipitation nowcasting. This paper addresses these issues by proposing a feature extraction method based on spatial correlation (FESC) and an end-to-end deep spatio-temporal fusion network (DST-FN) for precipitation nowcasting. FESC divides regions based on spatial correlation features extracted from radar echo reflectivity image sequences, improving the model’s understanding and prediction ability of meteorological data. We also introduce a Spatial Attention Mechanism (SAM) module into the TrajGRU model for better performance by adding a new memory channel. The proposed DST-FN framework utilizes the features extracted by FESC and temporal information, overcoming the limitations of encoding-decoding structures in precipitation nowcasting. 
Our approach demonstrates improved efficiency and effectiveness in capturing complex spatio-temporal dynamics compared to existing deep learning models.</p>","PeriodicalId":49318,"journal":{"name":"Earth Science Informatics","volume":"32 1","pages":""},"PeriodicalIF":2.7000,"publicationDate":"2024-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Precipitation nowcasting leveraging spatial correlation feature extraction and deep spatio-temporal fusion network\",\"authors\":\"Wenbin Yu, Yangsong Li, Cheng Fan, Daoyong Fu, Chengjun Zhang, Yadang Chen, Ming Qian, Jie Liu, Gaoping Liu\",\"doi\":\"10.1007/s12145-024-01412-5\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Precipitation nowcasting is crucial for various applications. However, existing deep learning models for meteorological applications face challenges regarding training efficiency, generalization of spatial features, and capturing long-range spatial dependencies. In particular, convolutional neural networks struggle to describe the complete spatial dependencies in radar echo reflectivity image sequences, making it difficult to model spatial features effectively. Additionally, current approaches using Encoder-Decoder structures based on recurrent neural networks have limited success in capturing global spatial dependencies and trajectory motion features in radar echo reflectivity images, especially for medium to high-intensity precipitation nowcasting. This paper addresses these issues by proposing a feature extraction method based on spatial correlation (FESC) and an end-to-end deep spatio-temporal fusion network (DST-FN) for precipitation nowcasting. FESC divides regions based on spatial correlation features extracted from radar echo reflectivity image sequences, improving the model’s understanding and prediction ability of meteorological data. 
We also introduce a Spatial Attention Mechanism (SAM) module into the TrajGRU model for better performance by adding a new memory channel. The proposed DST-FN framework utilizes the features extracted by FESC and temporal information, overcoming the limitations of encoding-decoding structures in precipitation nowcasting. Our approach demonstrates improved efficiency and effectiveness in capturing complex spatio-temporal dynamics compared to existing deep learning models.</p>\",\"PeriodicalId\":49318,\"journal\":{\"name\":\"Earth Science Informatics\",\"volume\":\"32 1\",\"pages\":\"\"},\"PeriodicalIF\":2.7000,\"publicationDate\":\"2024-07-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Earth Science Informatics\",\"FirstCategoryId\":\"89\",\"ListUrlMain\":\"https://doi.org/10.1007/s12145-024-01412-5\",\"RegionNum\":4,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Earth Science Informatics","FirstCategoryId":"89","ListUrlMain":"https://doi.org/10.1007/s12145-024-01412-5","RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Precipitation nowcasting leveraging spatial correlation feature extraction and deep spatio-temporal fusion network
Precipitation nowcasting is crucial for a wide range of applications. However, existing deep learning models for meteorological applications face challenges in training efficiency, generalization of spatial features, and capturing long-range spatial dependencies. In particular, convolutional neural networks struggle to describe the complete spatial dependencies in radar echo reflectivity image sequences, making it difficult to model spatial features effectively. Additionally, current approaches using encoder-decoder structures based on recurrent neural networks have limited success in capturing global spatial dependencies and trajectory motion features in radar echo reflectivity images, especially for medium- to high-intensity precipitation nowcasting. This paper addresses these issues by proposing a feature extraction method based on spatial correlation (FESC) and an end-to-end deep spatio-temporal fusion network (DST-FN) for precipitation nowcasting. FESC partitions the domain into regions based on spatial correlation features extracted from radar echo reflectivity image sequences, improving the model's understanding of meteorological data and its predictive ability. We also introduce a Spatial Attention Mechanism (SAM) module into the TrajGRU model, adding a new memory channel to improve performance. The proposed DST-FN framework combines the features extracted by FESC with temporal information, overcoming the limitations of encoder-decoder structures in precipitation nowcasting. Compared to existing deep learning models, our approach captures complex spatio-temporal dynamics more efficiently and effectively.
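The abstract does not detail how FESC computes spatial correlation, but the general idea of partitioning a radar image domain by how strongly local patches co-vary over time can be sketched as follows. This is a hypothetical illustration, not the authors' algorithm: the patch size, the use of a domain-wide mean series as the correlation reference, and the threshold are all assumptions.

```python
import numpy as np

def spatial_correlation_map(frames: np.ndarray, patch: int = 8) -> np.ndarray:
    """For each non-overlapping patch of a (time, height, width) radar
    echo sequence, correlate the patch's mean-intensity time series
    with the domain-wide mean series."""
    t, h, w = frames.shape
    gh, gw = h // patch, w // patch
    global_series = frames.reshape(t, -1).mean(axis=1)  # (t,)
    corr = np.zeros((gh, gw))
    for i in range(gh):
        for j in range(gw):
            block = frames[:, i * patch:(i + 1) * patch,
                           j * patch:(j + 1) * patch]
            series = block.reshape(t, -1).mean(axis=1)
            # Guard against constant series (zero variance).
            if series.std() > 0 and global_series.std() > 0:
                corr[i, j] = np.corrcoef(series, global_series)[0, 1]
    return corr

def divide_regions(corr: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Binary region mask: 1 where a patch co-varies strongly with
    the domain-wide signal, 0 elsewhere."""
    return (corr >= threshold).astype(np.uint8)
```

A region mask of this kind could then gate which features are emphasized by a downstream spatio-temporal model; the actual FESC method in the paper may differ substantially.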
Journal description:
The Earth Science Informatics [ESIN] journal aims at rapid publication of high-quality, current, cutting-edge, and provocative scientific work in the area of Earth Science Informatics as it relates to Earth systems science and space science. This includes articles on the application of formal and computational methods, computational Earth science, spatial and temporal analyses, and all aspects of computer applications to the acquisition, storage, processing, interchange, and visualization of data and information about the materials, properties, processes, features, and phenomena that occur at all scales and locations in the Earth system’s five components (atmosphere, hydrosphere, geosphere, biosphere, cryosphere) and in space (see "About this journal" for more detail). The quarterly journal publishes research, methodology, and software articles, as well as editorials, comments, and book and software reviews. Review articles of relevant findings, topics, and methodologies are also considered.