Near-real-time wildfire detection approach with Himawari-8/9 geostationary satellite data integrating multi-scale spatial–temporal feature

IF 7.6 Q1 REMOTE SENSING
Lizhi Zhang, Qiang Zhang, Qianqian Yang, Linwei Yue, Jiang He, Xianyu Jin, Qiangqiang Yuan
Citations: 0

Abstract

Wildfires pose a serious threat to the ecological environment and human safety, so rapid and accurate detection of wildfires is of great importance. However, existing wildfire detection methods do not fully integrate spatial–temporal relationships across different scales, and thus suffer from low robustness and accuracy across varying wildfire scenes. To address this, we propose a deep learning model for near-real-time wildfire detection, whose core idea is to integrate multi-scale spatial–temporal features (MSSTF) to efficiently capture the dynamics of wildfires. Specifically, we design a multi-kernel attention-based convolution (MKAC) module that extracts spatial features representing the differences between fire and non-fire pixels within multi-scale receptive fields. In addition, a long short-term Transformer (LSTT) module captures temporal differences from image sequences with different window lengths. The two modules are combined into multiple streams to integrate the multi-scale spatial–temporal features, and the multi-stream features are then fused to generate the fire classification map. Extensive experiments on various fire scenes show that the proposed method outperforms JAXA Wildfire products and representative deep learning models, achieving the best accuracy scores (average fire accuracy (FA): 88.25%, average false alarm rate (FAR): 20.82%). The results also show that the method is sensitive to early-stage fire events and can be applied to near-real-time wildfire detection with 10-minute Himawari-8/9 satellite data. The data and code used in this study are available at: https://github.com/eagle-void/MSSTF.
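To make the multi-stream spatial–temporal idea more concrete, below is a minimal PyTorch sketch of such an architecture. It is not the authors' released implementation (see the linked GitHub repository for that); all module names, channel counts, kernel sizes, band numbers, and window lengths here are illustrative assumptions only.

```python
# Minimal sketch of a multi-scale spatial-temporal fire classifier, assuming
# hypothetical channel sizes, kernel sizes, and window lengths. Not the
# authors' MSSTF implementation.
import torch
import torch.nn as nn


class MultiKernelSpatialBlock(nn.Module):
    """Spatial features from several receptive fields, gated by a simple
    channel attention (a stand-in for the MKAC module)."""

    def __init__(self, in_ch: int, out_ch: int, kernel_sizes=(1, 3, 5)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(in_ch, out_ch, k, padding=k // 2) for k in kernel_sizes
        )
        fused = out_ch * len(kernel_sizes)
        self.attn = nn.Sequential(          # squeeze-and-excitation style gate
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(fused, fused, 1),
            nn.Sigmoid(),
        )
        self.project = nn.Conv2d(fused, out_ch, 1)

    def forward(self, x):                   # x: (B, C, H, W)
        feats = torch.cat([b(x) for b in self.branches], dim=1)
        return self.project(feats * self.attn(feats))


class TemporalTransformerBlock(nn.Module):
    """Per-pixel temporal attention over the last `window` frames
    (a stand-in for the LSTT module)."""

    def __init__(self, dim: int, window: int, heads: int = 4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)
        self.window = window

    def forward(self, x):                    # x: (B, T, C, H, W)
        b, t, c, h, w = x.shape
        x = x[:, -self.window:]               # keep only the most recent frames
        tokens = x.permute(0, 3, 4, 1, 2).reshape(b * h * w, -1, c)
        tokens = self.encoder(tokens)          # attend across time per pixel
        return tokens[:, -1].reshape(b, h, w, c).permute(0, 3, 1, 2)


class FireClassifierSketch(nn.Module):
    """Two spatial-temporal streams (short and long windows) fused into a
    per-pixel fire / non-fire probability map."""

    def __init__(self, in_ch: int = 7, dim: int = 32, windows=(3, 6)):
        super().__init__()
        self.spatial = MultiKernelSpatialBlock(in_ch, dim)
        self.streams = nn.ModuleList(TemporalTransformerBlock(dim, w) for w in windows)
        self.head = nn.Conv2d(dim * len(windows), 1, 1)

    def forward(self, seq):                   # seq: (B, T, C, H, W)
        b, t, c, h, w = seq.shape
        feats = self.spatial(seq.reshape(b * t, c, h, w)).reshape(b, t, -1, h, w)
        fused = torch.cat([stream(feats) for stream in self.streams], dim=1)
        return torch.sigmoid(self.head(fused))  # fire probability per pixel


if __name__ == "__main__":
    model = FireClassifierSketch()
    frames = torch.randn(1, 6, 7, 64, 64)      # 6 Himawari-like frames, 7 bands
    print(model(frames).shape)                  # torch.Size([1, 1, 64, 64])
```

The sketch only illustrates the design choice of running parallel temporal streams with different window lengths over shared multi-kernel spatial features and fusing them for per-pixel classification; the actual MSSTF modules differ in detail.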
Journal
International Journal of Applied Earth Observation and Geoinformation (ITC Journal)
Subject areas: Global and Planetary Change; Management, Monitoring, Policy and Law; Earth-Surface Processes; Computers in Earth Sciences
CiteScore: 12.00
Self-citation rate: 0.00%
Articles published: 0
Review time: 77 days
Journal description: The International Journal of Applied Earth Observation and Geoinformation publishes original papers that utilize earth observation data for natural resource and environmental inventory and management. These data primarily originate from remote sensing platforms, including satellites and aircraft, supplemented by surface and subsurface measurements. Addressing natural resources such as forests, agricultural land, soils, and water, as well as environmental concerns like biodiversity, land degradation, and hazards, the journal explores conceptual and data-driven approaches. It covers geoinformation themes such as data capture, databasing, visualization, interpretation, data quality, and spatial uncertainty.