{"title":"大自然启发的消防员助手由无人机(UAV)数据","authors":"Seyed Muhammad Hossein Mousavi, Atiye Ilanloo","doi":"10.5267/j.jfs.2023.1.004","DOIUrl":null,"url":null,"abstract":"One of the most hazardous phenomena in forests is wildfire or bush fire and early detection of massive damage prevention is vital. Employing Unmanned Aerial Vehicles (UAV) as a visual and extinguisher tool in order to prevent this tragedy which brings fatal effects on humans and wildlife has high importance. Additionally, using aerial imagery could assist firefighters to recognize fire intensity and localize and route the fire in the forest which shrinks down casualties of firefighters. All these benefits and more is just possible by employing cheap UAVs. The proposed research uses nature-inspired image processing techniques in order to segment and classify fire in color and thermal images. Multiple nature-inspired and traditional computer vision techniques, including Chicken Swarm Algorithm (CSA) intensity adjustment (contrast enhancement), Denoising Convolutional Neural Network (DnCNN), Local Phase Quantization (LPQ) feature extraction, Bees Image Segmentation, Biogeography-Based Optimization (BBO) feature selection, Firefly Algorithm (FA) classification and more are employed to achieve high classification and segmentation accuracy. The system evaluates nine performance metrics including, F-Score, Accuracy, and Jaccard for the segmentation stage and four performance metrics for the classification stage. All experiments are conducted on the two most recent UAV fire datasets of FLAME (2021) and DeepFire (2022). Additionally, fire intensity, fire direction, and fire geometrical calculation are calculated which assists firefighters even more. As smoke shows the location of the fire, a smoke detection workflow is proposed, too. Proposed system Compared with traditional and novel methods for segmentation and classification leading to satisfactory and promising results for almost all metrics. The trained model of this system could be used in most of the current rescue UAVs in real-time applications. For the FLAME dataset (color data), segmentation precision is 95.57 % and classification accuracy is 91.33 %. Also, For the DeepFire dataset segmentation precision is 91.74 % and classification accuracy is 96.88 %.","PeriodicalId":150615,"journal":{"name":"Journal of Future Sustainability","volume":"42 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Nature inspired firefighter assistant by unmanned aerial vehicle (UAV) data\",\"authors\":\"Seyed Muhammad Hossein Mousavi, Atiye Ilanloo\",\"doi\":\"10.5267/j.jfs.2023.1.004\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"One of the most hazardous phenomena in forests is wildfire or bush fire and early detection of massive damage prevention is vital. Employing Unmanned Aerial Vehicles (UAV) as a visual and extinguisher tool in order to prevent this tragedy which brings fatal effects on humans and wildlife has high importance. Additionally, using aerial imagery could assist firefighters to recognize fire intensity and localize and route the fire in the forest which shrinks down casualties of firefighters. All these benefits and more is just possible by employing cheap UAVs. The proposed research uses nature-inspired image processing techniques in order to segment and classify fire in color and thermal images. 
Multiple nature-inspired and traditional computer vision techniques, including Chicken Swarm Algorithm (CSA) intensity adjustment (contrast enhancement), Denoising Convolutional Neural Network (DnCNN), Local Phase Quantization (LPQ) feature extraction, Bees Image Segmentation, Biogeography-Based Optimization (BBO) feature selection, Firefly Algorithm (FA) classification and more are employed to achieve high classification and segmentation accuracy. The system evaluates nine performance metrics including, F-Score, Accuracy, and Jaccard for the segmentation stage and four performance metrics for the classification stage. All experiments are conducted on the two most recent UAV fire datasets of FLAME (2021) and DeepFire (2022). Additionally, fire intensity, fire direction, and fire geometrical calculation are calculated which assists firefighters even more. As smoke shows the location of the fire, a smoke detection workflow is proposed, too. Proposed system Compared with traditional and novel methods for segmentation and classification leading to satisfactory and promising results for almost all metrics. The trained model of this system could be used in most of the current rescue UAVs in real-time applications. For the FLAME dataset (color data), segmentation precision is 95.57 % and classification accuracy is 91.33 %. Also, For the DeepFire dataset segmentation precision is 91.74 % and classification accuracy is 96.88 %.\",\"PeriodicalId\":150615,\"journal\":{\"name\":\"Journal of Future Sustainability\",\"volume\":\"42 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1900-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Future Sustainability\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.5267/j.jfs.2023.1.004\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Future Sustainability","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5267/j.jfs.2023.1.004","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Nature inspired firefighter assistant by unmanned aerial vehicle (UAV) data
One of the most hazardous phenomena in forests is wildfire or bushfire, and early detection is vital to prevent massive damage. Employing Unmanned Aerial Vehicles (UAVs) as visual and extinguishing tools to prevent this tragedy, which has fatal effects on humans and wildlife, is of high importance. Additionally, aerial imagery can assist firefighters in recognizing fire intensity and in localizing and routing the fire in the forest, which reduces firefighter casualties. All of these benefits, and more, are possible with inexpensive UAVs. The proposed research uses nature-inspired image processing techniques to segment and classify fire in color and thermal images. Multiple nature-inspired and traditional computer vision techniques are employed to achieve high classification and segmentation accuracy, including Chicken Swarm Algorithm (CSA) intensity adjustment (contrast enhancement), a Denoising Convolutional Neural Network (DnCNN), Local Phase Quantization (LPQ) feature extraction, Bees Image Segmentation, Biogeography-Based Optimization (BBO) feature selection, and Firefly Algorithm (FA) classification, among others. The system is evaluated with nine performance metrics, including F-Score, Accuracy, and Jaccard, for the segmentation stage and four performance metrics for the classification stage. All experiments are conducted on the two most recent UAV fire datasets, FLAME (2021) and DeepFire (2022). Additionally, fire intensity, fire direction, and fire geometry are calculated, which assists firefighters even further. Since smoke indicates the location of a fire, a smoke detection workflow is also proposed. The proposed system is compared with traditional and novel segmentation and classification methods, yielding satisfactory and promising results on almost all metrics. The trained model could be used in most current rescue UAVs in real-time applications. For the FLAME dataset (color data), segmentation precision is 95.57% and classification accuracy is 91.33%; for the DeepFire dataset, segmentation precision is 91.74% and classification accuracy is 96.88%.
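To make the reported numbers concrete, the sketch below (not taken from the paper) shows how segmentation metrics such as Precision, Accuracy, F-Score, and Jaccard are commonly computed from binary fire masks. It assumes NumPy arrays `pred` and `truth` of identical shape; the function name `segmentation_scores` and the synthetic masks in the example are illustrative only.

```python
# Minimal sketch of overlap metrics for binary fire-segmentation masks.
# Assumes `pred` and `truth` are same-shape arrays where True marks fire pixels.
import numpy as np

def segmentation_scores(pred: np.ndarray, truth: np.ndarray) -> dict:
    """Compute common overlap metrics between a predicted and a ground-truth mask."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)

    tp = np.logical_and(pred, truth).sum()      # fire pixels detected correctly
    fp = np.logical_and(pred, ~truth).sum()     # background flagged as fire
    fn = np.logical_and(~pred, truth).sum()     # fire pixels missed
    tn = np.logical_and(~pred, ~truth).sum()    # background kept as background

    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    f_score = (2 * precision * recall / (precision + recall)
               if (precision + recall) else 0.0)
    jaccard = tp / (tp + fp + fn) if (tp + fp + fn) else 0.0

    return {"Precision": precision, "Recall": recall, "Accuracy": accuracy,
            "F-Score": f_score, "Jaccard": jaccard}

if __name__ == "__main__":
    # Small synthetic masks: the predicted fire region is shifted by one row.
    truth = np.zeros((8, 8), dtype=bool); truth[2:6, 2:6] = True
    pred = np.zeros((8, 8), dtype=bool); pred[3:7, 2:6] = True
    print(segmentation_scores(pred, truth))
```

The same confusion-matrix counts underlie the classification-stage metrics as well, applied per image label rather than per pixel.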