{"title":"UAV-StrawFire:一个可见光和红外数据集,用于实时秸秆火灾监测,具有深度学习和图像融合","authors":"Xikun Hu, Ya Jiang, Xiaoyan Xia, Chen Chen, Wenlin Liu, Pengcheng Wan, Kangcheng Bin, Ping Zhong","doi":"10.1016/j.jag.2025.104586","DOIUrl":null,"url":null,"abstract":"<div><div>Straw burning poses significant threats to local air quality and nearby public health by emitting harmful pollutants during specific seasons. Traditional satellite-based remote sensing techniques encounter difficulties in monitoring small-scale straw-burning events due to long revisit intervals and low spatial resolution. To address this challenge, unmanned aerial vehicles (UAVs) equipped with imaging sensors have emerged as a rapid and cost-effective solution for monitoring and detecting straw fires. This paper presents the UAV-StrawFire dataset, which comprises RGB images, thermal infrared images, and videos captured during controlled straw residue burning experiments in southern China using drones. The dataset is annotated and labeled to support the application of detection, segmentation, and tracking algorithms. This study addresses three key machine learning tasks using the dataset: (1) flame detection, achieved through a feature-based multi-modal image fusion model (FF-YOLOv5n) reaching a mAP50-95 of 0.5764; (2) flame segmentation, which delineates fire boundaries using the real-time lightweight BiSeNetV2 model, achieving a high mean Intersection over Union (mIoU) score exceeding 0.88; and (3) flame tracking, which monitors the real-time progression of straw burning with a precision of 0.9065 and a success rate of 0.6593 using the Aba-ViTrack algorithm, suitable for on-board processing on UAVs at 50 frames per second (FPS). These experiments provide efficient baseline models for UAV-based straw-burning monitoring with edge computing capabilities. 
The UAV-StrawFire dataset enables the detection and monitoring of flame regions with varying sizes, textures, and opacities, thereby supporting potential straw-burning control efforts. The dataset is publicly available on IEEE Dataport, offering a valuable resource for researchers in the remote sensing and machine learning communities to advance the development of effective straw-burning monitoring systems.</div></div>","PeriodicalId":73423,"journal":{"name":"International journal of applied earth observation and geoinformation : ITC journal","volume":"141 ","pages":"Article 104586"},"PeriodicalIF":7.6000,"publicationDate":"2025-05-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"UAV-StrawFire: A visible and infrared dataset for real-time straw-fire monitoring with deep learning and image fusion\",\"authors\":\"Xikun Hu, Ya Jiang, Xiaoyan Xia, Chen Chen, Wenlin Liu, Pengcheng Wan, Kangcheng Bin, Ping Zhong\",\"doi\":\"10.1016/j.jag.2025.104586\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Straw burning poses significant threats to local air quality and nearby public health by emitting harmful pollutants during specific seasons. Traditional satellite-based remote sensing techniques encounter difficulties in monitoring small-scale straw-burning events due to long revisit intervals and low spatial resolution. To address this challenge, unmanned aerial vehicles (UAVs) equipped with imaging sensors have emerged as a rapid and cost-effective solution for monitoring and detecting straw fires. This paper presents the UAV-StrawFire dataset, which comprises RGB images, thermal infrared images, and videos captured during controlled straw residue burning experiments in southern China using drones. The dataset is annotated and labeled to support the application of detection, segmentation, and tracking algorithms. 
This study addresses three key machine learning tasks using the dataset: (1) flame detection, achieved through a feature-based multi-modal image fusion model (FF-YOLOv5n) reaching a mAP50-95 of 0.5764; (2) flame segmentation, which delineates fire boundaries using the real-time lightweight BiSeNetV2 model, achieving a high mean Intersection over Union (mIoU) score exceeding 0.88; and (3) flame tracking, which monitors the real-time progression of straw burning with a precision of 0.9065 and a success rate of 0.6593 using the Aba-ViTrack algorithm, suitable for on-board processing on UAVs at 50 frames per second (FPS). These experiments provide efficient baseline models for UAV-based straw-burning monitoring with edge computing capabilities. The UAV-StrawFire dataset enables the detection and monitoring of flame regions with varying sizes, textures, and opacities, thereby supporting potential straw-burning control efforts. The dataset is publicly available on IEEE Dataport, offering a valuable resource for researchers in the remote sensing and machine learning communities to advance the development of effective straw-burning monitoring systems.</div></div>\",\"PeriodicalId\":73423,\"journal\":{\"name\":\"International journal of applied earth observation and geoinformation : ITC journal\",\"volume\":\"141 \",\"pages\":\"Article 104586\"},\"PeriodicalIF\":7.6000,\"publicationDate\":\"2025-05-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International journal of applied earth observation and geoinformation : ITC 
journal\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S156984322500233X\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"REMOTE SENSING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International journal of applied earth observation and geoinformation : ITC journal","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S156984322500233X","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"REMOTE SENSING","Score":null,"Total":0}
UAV-StrawFire: A visible and infrared dataset for real-time straw-fire monitoring with deep learning and image fusion
Straw burning poses significant threats to local air quality and nearby public health by emitting harmful pollutants during specific seasons. Traditional satellite-based remote sensing techniques encounter difficulties in monitoring small-scale straw-burning events due to long revisit intervals and low spatial resolution. To address this challenge, unmanned aerial vehicles (UAVs) equipped with imaging sensors have emerged as a rapid and cost-effective solution for monitoring and detecting straw fires. This paper presents the UAV-StrawFire dataset, which comprises RGB images, thermal infrared images, and videos captured during controlled straw residue burning experiments in southern China using drones. The dataset is annotated and labeled to support the application of detection, segmentation, and tracking algorithms. This study addresses three key machine learning tasks using the dataset: (1) flame detection, achieved through a feature-based multi-modal image fusion model (FF-YOLOv5n) reaching a mAP50-95 of 0.5764; (2) flame segmentation, which delineates fire boundaries using the real-time lightweight BiSeNetV2 model, achieving a high mean Intersection over Union (mIoU) score exceeding 0.88; and (3) flame tracking, which monitors the real-time progression of straw burning with a precision of 0.9065 and a success rate of 0.6593 using the Aba-ViTrack algorithm, suitable for on-board processing on UAVs at 50 frames per second (FPS). These experiments provide efficient baseline models for UAV-based straw-burning monitoring with edge computing capabilities. The UAV-StrawFire dataset enables the detection and monitoring of flame regions with varying sizes, textures, and opacities, thereby supporting potential straw-burning control efforts. The dataset is publicly available on IEEE Dataport, offering a valuable resource for researchers in the remote sensing and machine learning communities to advance the development of effective straw-burning monitoring systems.
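The abstract reports segmentation quality as mean Intersection over Union (mIoU) and tracking quality as a precision score over frames. As a minimal illustration of what these metrics measure (this is not the authors' evaluation code; the toy masks, center coordinates, and the 20-pixel precision threshold are assumptions for the example), the standard definitions can be sketched as:

```python
import numpy as np

def binary_iou(pred: np.ndarray, target: np.ndarray) -> float:
    """IoU for one binary mask pair: |intersection| / |union|."""
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return float(inter) / float(union) if union > 0 else 1.0

def mean_iou(pred: np.ndarray, target: np.ndarray, num_classes: int) -> float:
    """Mean IoU over classes (e.g. background and flame for num_classes=2)."""
    return float(np.mean([binary_iou(pred == c, target == c)
                          for c in range(num_classes)]))

def tracking_precision(pred_centers: np.ndarray, gt_centers: np.ndarray,
                       thresh: float = 20.0) -> float:
    """Fraction of frames whose predicted box center lies within
    `thresh` pixels of the ground-truth center (OTB-style precision)."""
    errors = np.linalg.norm(pred_centers - gt_centers, axis=1)
    return float(np.mean(errors <= thresh))

# Toy 4x4 label maps: 0 = background, 1 = flame
pred_mask = np.array([[0, 0, 1, 1],
                      [0, 1, 1, 1],
                      [0, 0, 1, 0],
                      [0, 0, 0, 0]])
gt_mask   = np.array([[0, 0, 1, 1],
                      [0, 0, 1, 1],
                      [0, 0, 1, 1],
                      [0, 0, 0, 0]])
print(round(mean_iou(pred_mask, gt_mask, 2), 4))  # → 0.7662

# Toy per-frame box centers (x, y) for three frames
pred_c = np.array([[10.0, 10.0], [50.0, 52.0], [100.0, 140.0]])
gt_c   = np.array([[12.0, 10.0], [50.0, 50.0], [100.0, 100.0]])
print(round(tracking_precision(pred_c, gt_c), 4))  # → 0.6667
```

The reported mIoU above 0.88 and precision of 0.9065 would correspond to these quantities computed over the full annotated test set rather than toy inputs.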
Journal description:
The International Journal of Applied Earth Observation and Geoinformation publishes original papers that utilize earth observation data for natural resource and environmental inventory and management. These data primarily originate from remote sensing platforms, including satellites and aircraft, supplemented by surface and subsurface measurements. Addressing natural resources such as forests, agricultural land, soils, and water, as well as environmental concerns like biodiversity, land degradation, and hazards, the journal explores conceptual and data-driven approaches. It covers geoinformation themes like capturing, databasing, visualization, interpretation, data quality, and spatial uncertainty.