Title: Spatiotemporal fusion convolutional neural network: tropical cyclone intensity estimation from multisource remote sensing images
Authors: Randi Fu, Haiyan Hu, Nan Wu, Zhening Liu, Wei Jin
DOI: 10.1117/1.jrs.18.018501 (https://doi.org/10.1117/1.jrs.18.018501)
Publication date: 2024-01-01
Publication type: Journal Article
Citations: 0
Abstract
Utilizing multisource remote sensing images to accurately estimate tropical cyclone (TC) intensity is crucial and challenging. Traditional approaches rely on a single image for intensity estimation and lack the capability to perceive dynamic spatiotemporal information. Meanwhile, many existing deep learning methods sample from a time series of fixed length and depend on computation-intensive 3D feature extraction modules, limiting the model’s flexibility and scalability. By organically linking the genesis and dissipation mechanisms of a TC with computer vision techniques, we introduce a spatiotemporal fusion convolutional neural network that integrates three distinct improvement approaches. First, an a priori aware nonparametric fusion module is introduced to effectively fuse key features from multisource remote sensing data. Second, we design a scale-aware contraction–expansion module. This module effectively captures detailed features of the TC by connecting information from different scales through a weighted and up-sampling method. Finally, we propose a 1D–2D conditional sampling training method that balances single-step regression (for short sequences) and latent-variable-based temporal modeling (for long sequences) to achieve flexible spatiotemporal feature perception, thereby avoiding the data scale constraint imposed by fixed sequence lengths. Through qualitative and quantitative experimental comparisons, the proposed spatiotemporal fusion convolutional neural network achieved a root-mean-square error of 8.89 kt, marking a 29.7% improvement over the advanced Dvorak technique, and its efficacy in actual TC case analyses indicates its practical viability and potential for broader applications.
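The scale-aware contraction–expansion module described above connects features from different scales through weighting and up-sampling. The paper does not give implementation details, so the following is only a minimal NumPy sketch of that general idea, assuming nearest-neighbor up-sampling and a single scalar blending weight (both are illustrative assumptions, not the authors' actual design):

```python
import numpy as np

def upsample_nearest(x, factor):
    """Nearest-neighbor up-sampling of a 2D feature map by an integer factor."""
    return np.repeat(np.repeat(x, factor, axis=0), factor, axis=1)

def weighted_fuse(fine, coarse, weight=0.5):
    """Bring a coarse-scale feature map onto the fine grid, then blend it
    into the fine-scale map with a scalar weight (hypothetical scheme)."""
    factor = fine.shape[0] // coarse.shape[0]
    return fine + weight * upsample_nearest(coarse, factor)

# Example: a 4x4 fine map fused with a 2x2 coarse map.
fine = np.ones((4, 4))
coarse = np.full((2, 2), 2.0)
fused = weighted_fuse(fine, coarse, weight=0.5)  # 1 + 0.5 * 2 = 2 everywhere
```

In a real network the weight would typically be learned and the up-sampling would be bilinear or transposed-convolutional; this sketch only shows how cross-scale information can be aligned and combined additively.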
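As a back-of-the-envelope consistency check on the reported numbers (an illustration, not taken from the paper): an RMSE of 8.89 kt that represents a 29.7% improvement implies an advanced Dvorak technique baseline of roughly 12.6 kt.

```python
import math

def rmse(preds, obs):
    """Root-mean-square error between predicted and observed intensities (kt)."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(preds, obs)) / len(preds))

# Reported: 8.89 kt RMSE, a 29.7% improvement over the advanced Dvorak
# technique. Reading improvement as (baseline - ours) / baseline gives:
implied_baseline = 8.89 / (1 - 0.297)  # ~12.65 kt
```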