CNN-Based Video Surveillance for Fire and Localization Detection

Sudharani, R. Shirwaikar, Lincy Mathews
DOI: 10.1109/CCIP57447.2022.10058625
Published in: 2022 Fourth International Conference on Cognitive Computing and Information Processing (CCIP)
Publication date: 2022-12-23
Citations: 1

Abstract

Convolutional neural networks (CNNs) have been highly successful in image classification and other computer vision tasks. Applying CNNs to fire recognition can greatly improve detection accuracy, reducing fire tragedies and their societal and environmental consequences. However, because inference demands substantial memory and computing power, deploying CNN-based fire detection in a real video surveillance network remains a significant challenge. We describe a new, energy-efficient approach to fire detection, localization, and semantic understanding using a computationally efficient CNN architecture based on the SqueezeNet design. It uses compact convolutional kernels and avoids large fully connected layers to reduce computational load. Despite its minimal processing requirements, experimental results show that the proposed approach achieves accuracy = 99.7%, F1-score = 98.49%, precision = 98.99%, and recall = 98.00%. Furthermore, by considering the specific characteristics of the deployment scenario as well as the variety of fire data, the study shows how the efficiency and accuracy of the fire detection model can be improved.
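The SqueezeNet-style "fire module" the abstract alludes to is the standard way compact 1x1 kernels replace larger layers: a 1x1 "squeeze" layer first cuts the channel count, then parallel 1x1 and 3x3 "expand" branches restore it. The sketch below is a minimal NumPy illustration of that structure, not the paper's implementation; the layer widths (64 → 16 → 128) are illustrative assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def conv1x1(x, w):
    """Pointwise convolution: x is (C_in, H, W), w is (C_out, C_in)."""
    return np.tensordot(w, x, axes=([1], [0]))

def conv3x3(x, w):
    """3x3 'same' convolution with zero padding: w is (C_out, C_in, 3, 3)."""
    c_in, h, wd = x.shape
    c_out = w.shape[0]
    xp = np.pad(x, ((0, 0), (1, 1), (1, 1)))
    out = np.zeros((c_out, h, wd))
    for i in range(3):
        for j in range(3):
            # Accumulate the contribution of each of the nine kernel taps.
            out += np.tensordot(w[:, :, i, j], xp[:, i:i + h, j:j + wd],
                                axes=([1], [0]))
    return out

def fire_module(x, w_squeeze, w_expand1, w_expand3):
    """Squeeze with 1x1 kernels, then expand with parallel 1x1/3x3 branches."""
    s = relu(conv1x1(x, w_squeeze))          # channel bottleneck
    e1 = relu(conv1x1(s, w_expand1))         # cheap pointwise branch
    e3 = relu(conv3x3(s, w_expand3))         # small spatial branch
    return np.concatenate([e1, e3], axis=0)  # stack branch outputs

# Example: 64 input channels squeezed to 16, expanded back to 64 + 64.
rng = np.random.default_rng(0)
x = rng.standard_normal((64, 8, 8))
y = fire_module(x,
                rng.standard_normal((16, 64)),      # squeeze weights
                rng.standard_normal((64, 16)),      # 1x1 expand weights
                rng.standard_normal((64, 16, 3, 3)))  # 3x3 expand weights
print(y.shape)  # (128, 8, 8)
```

The computational savings the abstract claims come from the bottleneck: this module holds roughly 16·64 + 64·16 + 64·16·9 ≈ 11k weights, versus 64·128·9 ≈ 74k for a plain 3x3 layer with the same input and output widths.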