{"title":"MM_Fast_RCNN_ResNet: Construction of Multimodal Faster RCNN Inception and ResNet V2 for Pedestrian Tracking and detection","authors":"Johnson Kolluri, Sandeep Kumar Dash, Ranjita Das","doi":"10.5750/ijme.v1i1.1381","DOIUrl":null,"url":null,"abstract":"Pedestrian identification and tracking is a crucial duty in smart building monitoring. The development of sensors has led to architects' focus on smart building design. The image distortions caused by numerous external environmental factors present a significant problem for pedestrian recognition in smart buildings. It is difficult for machine learning algorithms and other conventional filter-based image classification methods, such as histograms of oriented gradient filters, to function efficiently when dealing with many input photos of pedestrians. Deep learning algorithms are now performing substantially better when processing an enormous amount of image data. This article evaluates a novel multimodal classifier-based pedestrian identification method. The proposed method is Multimodal Faster RCNN Inception and ResNet V2 (MM Fast RCNN ResNet). The collected attributes address a tracking problem and establish the foundation for several object recognition tasks (novelty). Our method's neural network is regularized, and the feature representation is automatically adjusted to the detection assignment, resulting in high accuracy (superior to the proposed method). The proposed method is assessed using the PenFudan dataset and contemporary techniques regarding several factors. It is discovered that the recommended MM Fast RCNN ResNet obtains precision, recall, FPPI, FPPW, and average precision of 0.9057, 0.8629, 0.0898, and 0.0943.","PeriodicalId":50313,"journal":{"name":"International Journal of Maritime Engineering","volume":null,"pages":null},"PeriodicalIF":0.7000,"publicationDate":"2024-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Maritime Engineering","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.5750/ijme.v1i1.1381","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ENGINEERING, MARINE","Score":null,"Total":0}
Abstract
Pedestrian identification and tracking is a crucial task in smart building monitoring. Advances in sensor technology have drawn architects' attention to smart building design, yet the image distortions caused by numerous external environmental factors remain a significant obstacle to pedestrian recognition in smart buildings. Machine learning algorithms and other conventional filter-based image classification methods, such as histogram of oriented gradients (HOG) filters, struggle to operate efficiently on large numbers of pedestrian images, whereas deep learning algorithms now perform substantially better when processing image data at this scale. This article evaluates a novel multimodal classifier-based pedestrian identification method, Multimodal Faster RCNN Inception and ResNet V2 (MM Fast RCNN ResNet). The extracted attributes address a tracking problem and establish the foundation for several object recognition tasks, which constitutes the novelty of the work. The method's neural network is regularized, and its feature representation is automatically adapted to the detection task, yielding accuracy superior to the compared methods. The proposed method is assessed on the PennFudan dataset against contemporary techniques with respect to several factors. The proposed MM Fast RCNN ResNet achieves a precision of 0.9057, a recall of 0.8629, an FPPI of 0.0898, and an FPPW of 0.0943.
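The abstract does not reproduce the paper's exact network construction, but a minimal sketch of pairing a Faster R-CNN detection head with an Inception-ResNet-V2 trunk can be assembled from standard libraries. The sketch below combines torchvision's FasterRCNN with timm's inception_resnet_v2 feature extractor; the InceptionResNetV2Backbone wrapper, the anchor sizes, and the input resolution are illustrative assumptions, not values taken from the paper.

```python
import torch
import timm  # assumed available; supplies the Inception-ResNet-V2 trunk
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.anchor_utils import AnchorGenerator
from torchvision.ops import MultiScaleRoIAlign


class InceptionResNetV2Backbone(torch.nn.Module):
    """Wraps timm's feature extractor so it returns the single feature map
    torchvision's FasterRCNN expects, and exposes `out_channels`."""

    def __init__(self):
        super().__init__()
        # pretrained=True would load ImageNet weights; False keeps the sketch offline.
        self.body = timm.create_model("inception_resnet_v2", pretrained=False,
                                      features_only=True, out_indices=(4,))
        self.out_channels = self.body.feature_info.channels()[-1]  # 1536

    def forward(self, x):
        return self.body(x)[-1]  # final stride-32 feature map


# Anchor sizes and RoIAlign settings are illustrative, not the paper's values.
anchors = AnchorGenerator(sizes=((32, 64, 128, 256, 512),),
                          aspect_ratios=((0.5, 1.0, 2.0),))
roi_pool = MultiScaleRoIAlign(featmap_names=["0"], output_size=7, sampling_ratio=2)

# num_classes=2: background + pedestrian.
model = FasterRCNN(InceptionResNetV2Backbone(), num_classes=2,
                   rpn_anchor_generator=anchors, box_roi_pool=roi_pool)

model.eval()
with torch.no_grad():
    preds = model([torch.rand(3, 600, 800)])  # list of {boxes, labels, scores}
print(preds[0]["boxes"].shape)
```

At inference the model returns one dictionary of boxes, labels, and confidence scores per input image, which is the raw material for the tracking and evaluation steps the abstract describes.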
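FPPI (false positives per image) and FPPW (false positives per window) are standard pedestrian-detection error rates reported alongside precision and recall. A minimal sketch of computing all four from per-image detections and ground truths follows; the greedy IoU matching at a 0.5 threshold and the windows_per_image denominator for FPPW are assumptions, since the abstract does not spell out the evaluation protocol.

```python
from typing import List, Sequence

Box = Sequence[float]  # [x1, y1, x2, y2]


def box_iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0


def detection_metrics(dets_per_img: List[List[Box]],
                      gts_per_img: List[List[Box]],
                      iou_thr: float = 0.5,
                      windows_per_image: int = 1000):
    """Greedy IoU matching of detections to ground truths.
    windows_per_image is an assumed sliding-window count used only
    as the FPPW denominator."""
    tp = fp = fn = 0
    for dets, gts in zip(dets_per_img, gts_per_img):
        matched = set()
        for d in dets:
            best, best_iou = None, iou_thr
            for i, g in enumerate(gts):
                iou = box_iou(d, g)
                if i not in matched and iou >= best_iou:
                    best, best_iou = i, iou
            if best is None:
                fp += 1  # unmatched detection: false positive
            else:
                matched.add(best)
                tp += 1
        fn += len(gts) - len(matched)  # missed ground truths
    n = len(dets_per_img)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall, fp / n, fp / (n * windows_per_image)


# Toy example: one image with one correct hit and one false alarm.
p, r, fppi, fppw = detection_metrics(
    [[[10, 10, 50, 100], [200, 20, 240, 110]]],  # detections
    [[[12, 12, 48, 98]]],                        # ground truth
)
print(p, r, fppi, fppw)  # 0.5 1.0 1.0 0.001
```

Lower FPPI and FPPW indicate fewer spurious detections; the paper's reported 0.0898 and 0.0943 would correspond to roughly one false alarm per eleven images under a protocol of this shape.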
Journal Introduction
The International Journal of Maritime Engineering (IJME) provides a forum for reporting and discussing technical and scientific issues associated with the design and construction of commercial marine vessels. Contributions in the form of papers and notes, together with discussion of published papers, are welcomed.