Fusion-Based Deep Learning Model for Automated Forest Fire Detection
Mesfer Al Duhayyim, Majdy M. Eltahir, Ola Abdelgney Omer Ali, Amani Abdulrahman Albraikan, Fahd N. Al-Wesabi, Anwer Mustafa Hilal, Manar Ahmed Hamza, Mohammed Rizwanullah
Computers, Materials & Continua, 2023. DOI: 10.32604/cmc.2023.024198
Abstract
Earth resource and environmental monitoring are essential for investigating environmental conditions and natural resources, supporting sustainable policy development, regulatory measures, and their implementation to improve the environment. Large-scale forest fires are a major hazard that affects the climate and life across the globe. Early identification of forest fires with automated tools is therefore essential to prevent fires from spreading widely. This paper focuses on the design of an automated forest fire detection model using fusion-based deep learning (AFFD-FDL) for environmental monitoring. The AFFD-FDL technique involves an entropy-based fusion model for feature extraction, which combines handcrafted features from the histogram of oriented gradients (HOG) with deep features from the SqueezeNet and Inception v3 models. In addition, an optimally tuned extreme learning machine (ELM) classifier is used to determine whether fire is present. To tune the parameters of the ELM model, the oppositional glowworm swarm optimization (OGSO) algorithm is employed, thereby improving forest fire detection performance. A wide range of simulation analyses is conducted on a benchmark dataset, and the results are examined from several perspectives. The experimental results highlight the superiority of the AFFD-FDL technique over recent state-of-the-art techniques.
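The following is a minimal sketch, for illustration only, of the kind of feature-fusion-plus-ELM pipeline the abstract describes: HOG features are concatenated with SqueezeNet and Inception v3 deep features and fed to a basic extreme learning machine. The torchvision/scikit-image tooling, the simple concatenation used in place of the paper's entropy-based fusion, and the plain least-squares ELM (no OGSO tuning) are all assumptions, not the authors' implementation.

```python
# Illustrative sketch only: concatenation stands in for the paper's entropy-based
# fusion, and the ELM output weights are solved by least squares rather than
# tuned with oppositional glowworm swarm optimization (OGSO).
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T
from skimage.feature import hog
from PIL import Image

# Pretrained CNN backbones used as fixed feature extractors (assumed setup).
squeezenet = models.squeezenet1_1(weights="DEFAULT").eval()
inception = models.inception_v3(weights="DEFAULT").eval()
inception.fc = torch.nn.Identity()  # expose the 2048-d pooled features

prep224 = T.Compose([T.Resize((224, 224)), T.ToTensor(),
                     T.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])])
prep299 = T.Compose([T.Resize((299, 299)), T.ToTensor(),
                     T.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])])

def extract_features(img: Image.Image) -> np.ndarray:
    """Fuse handcrafted HOG features with SqueezeNet and Inception v3 deep features."""
    hog_feat = hog(np.array(img.convert("L")),
                   pixels_per_cell=(16, 16), cells_per_block=(2, 2))
    with torch.no_grad():
        sq = squeezenet.features(prep224(img).unsqueeze(0))
        sq = torch.nn.functional.adaptive_avg_pool2d(sq, 1).flatten().numpy()
        inc = inception(prep299(img).unsqueeze(0)).flatten().numpy()
    # Simple concatenation in place of the entropy-based fusion described in the paper.
    return np.concatenate([hog_feat, sq, inc])

class ELM:
    """Basic extreme learning machine: random hidden layer, least-squares output weights."""
    def __init__(self, n_hidden: int = 500, seed: int = 0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X: np.ndarray, y: np.ndarray) -> "ELM":
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        self.beta = np.linalg.pinv(H) @ np.eye(2)[y]  # one-hot targets: fire / no-fire
        return self

    def predict(self, X: np.ndarray) -> np.ndarray:
        return np.argmax(np.tanh(X @ self.W + self.b) @ self.beta, axis=1)
```

Under these assumptions, training would amount to stacking `extract_features` outputs for a labeled image set into a matrix `X`, fitting `ELM().fit(X, y)`, and calling `predict` on features of new frames; the paper's OGSO step would replace the fixed random hidden-layer parameters with optimized ones.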