{"title":"Enhancing drone-based fire detection with flame-specific attention and optimized feature fusion","authors":"Qiang Wang , Shiyu Guan , Shuchang Lyu , Guangliang Cheng","doi":"10.1016/j.jag.2025.104655","DOIUrl":null,"url":null,"abstract":"<div><div>Recent advancements in drone-based fire early warning technologies have significantly improved fire detection, particularly in remote and forested areas where drones are widely utilized. However, the constrained battery life and limited computational resources of drones present challenges for real-time fire detection. Existing methods primarily focus on fire target identification without considering the distinct color and thermal characteristics of flames, leading to suboptimal detection accuracy. To address these issues, we propose a Flame-Specific Attention (FSA) mechanism, which integrates heat conduction principles and flame shape features to enhance receptive field expansion while maintaining computational efficiency. Furthermore, the Neck of the model is optimized with a Focal Modulation module to improve feature fusion, and a variable multi-attention detection head is introduced to refine detection precision. Experimental results on our Comprehensive Fire Scene Dataset (containing 3,905 images) demonstrate that our model achieves a mean Average Precision ([email protected]) of 87.7%, surpassing both Vision Transformers (ViTs) and traditional CNN approaches. Compared to the YOLOv10 baseline, our approach improves precision by 5.7% while maintaining an inference speed of 182 FPS, enabling real-time deployment in edge-computing scenarios such as drone-based fire detection. Additionally, the model effectively detects small- and medium-sized flames, reducing false positives in challenging lighting conditions (e.g., sunset and urban illumination). 
These enhancements make our approach highly suitable for early fire warning applications in forest and urban environments.</div></div>","PeriodicalId":73423,"journal":{"name":"International journal of applied earth observation and geoinformation : ITC journal","volume":"142 ","pages":"Article 104655"},"PeriodicalIF":7.6000,"publicationDate":"2025-06-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International journal of applied earth observation and geoinformation : ITC journal","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1569843225003024","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"REMOTE SENSING","Score":null,"Total":0}
Citations: 0
Abstract
Recent advancements in drone-based fire early warning technologies have significantly improved fire detection, particularly in remote and forested areas where drones are widely utilized. However, the constrained battery life and limited computational resources of drones present challenges for real-time fire detection. Existing methods primarily focus on fire target identification without considering the distinct color and thermal characteristics of flames, leading to suboptimal detection accuracy. To address these issues, we propose a Flame-Specific Attention (FSA) mechanism, which integrates heat conduction principles and flame shape features to enhance receptive field expansion while maintaining computational efficiency. Furthermore, the Neck of the model is optimized with a Focal Modulation module to improve feature fusion, and a variable multi-attention detection head is introduced to refine detection precision. Experimental results on our Comprehensive Fire Scene Dataset (containing 3,905 images) demonstrate that our model achieves a mean Average Precision (mAP@0.5) of 87.7%, surpassing both Vision Transformers (ViTs) and traditional CNN approaches. Compared to the YOLOv10 baseline, our approach improves precision by 5.7% while maintaining an inference speed of 182 FPS, enabling real-time deployment in edge-computing scenarios such as drone-based fire detection. Additionally, the model effectively detects small- and medium-sized flames, reducing false positives in challenging lighting conditions (e.g., sunset and urban illumination). These enhancements make our approach highly suitable for early fire warning applications in forest and urban environments.
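The mAP@0.5 metric reported above scores a detection as a true positive only when its Intersection-over-Union (IoU) with a ground-truth box reaches 0.5. A minimal sketch of that matching criterion (illustrative only, not the authors' evaluation code; box format `(x1, y1, x2, y2)` is an assumption):

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Overlap is clamped to zero when the boxes are disjoint.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def is_true_positive(pred, gt, threshold=0.5):
    """Under mAP@0.5, a prediction counts as correct when IoU >= 0.5."""
    return iou(pred, gt) >= threshold
```

For example, two unit-offset 2x2 boxes overlap with IoU 1/7 and would be rejected at the 0.5 threshold, which is why small flames with slightly misplaced boxes are easy to miss.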
Journal Introduction:
The International Journal of Applied Earth Observation and Geoinformation publishes original papers that utilize earth observation data for natural resource and environmental inventory and management. These data primarily originate from remote sensing platforms, including satellites and aircraft, supplemented by surface and subsurface measurements. Addressing natural resources such as forests, agricultural land, soils, and water, as well as environmental concerns like biodiversity, land degradation, and hazards, the journal explores conceptual and data-driven approaches. It covers geoinformation themes like capturing, databasing, visualization, interpretation, data quality, and spatial uncertainty.