Hua Zhao, Chao Xu, Jiaxing Chen, Zhexian Zhang, Xiang Wang
{"title":"BGLE-YOLO: A Lightweight Model for Underwater Bio-Detection.","authors":"Hua Zhao, Chao Xu, Jiaxing Chen, Zhexian Zhang, Xiang Wang","doi":"10.3390/s25051595","DOIUrl":null,"url":null,"abstract":"<p><p>Due to low contrast, chromatic aberration, and generally small objects in underwater environments, a new underwater fish detection model, BGLE-YOLO, is proposed to investigate automated methods dedicated to accurately detecting underwater objects in images. The model has small parameters and low computational effort and is suitable for edge devices. First, an efficient multi-scale convolutional EMC module is introduced to enhance the backbone network and capture the dynamic changes in targets in the underwater environment. Secondly, a global and local feature fusion module for small targets (BIG) is integrated into the neck network to preserve more feature information, reduce error information in higher-level features, and increase the model's effectiveness in detecting small targets. Finally, to prevent the detection accuracy impact due to excessive lightweighting, the lightweight shared head (LSH) is constructed. The reparameterization technique further improves detection accuracy without additional parameters and computational cost. Experimental results of BGLE-YOLO on the underwater datasets DUO (Detection Underwater Objects) and RUOD (Real-World Underwater Object Detection) show that the model achieves the same accuracy as the benchmark model with an ultra-low computational cost of 6.2 GFLOPs and an ultra-low model parameter of 1.6 MB.</p>","PeriodicalId":21698,"journal":{"name":"Sensors","volume":"25 5","pages":""},"PeriodicalIF":3.4000,"publicationDate":"2025-03-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11902696/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Sensors","FirstCategoryId":"103","ListUrlMain":"https://doi.org/10.3390/s25051595","RegionNum":3,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"CHEMISTRY, ANALYTICAL","Score":null,"Total":0}
引用次数: 0
Abstract
Underwater images suffer from low contrast, chromatic aberration, and generally small objects, which make accurate automated detection difficult. To address this, a new underwater fish detection model, BGLE-YOLO, is proposed. The model has a small parameter count and low computational cost, making it suitable for edge devices. First, an efficient multi-scale convolution (EMC) module is introduced into the backbone network to capture the dynamic changes of targets in the underwater environment. Second, a global and local feature fusion module for small targets (BIG) is integrated into the neck network to preserve more feature information, reduce erroneous information in higher-level features, and improve the model's effectiveness in detecting small targets. Finally, to prevent the loss of detection accuracy caused by excessive lightweighting, a lightweight shared head (LSH) is constructed; a reparameterization technique further improves detection accuracy without adding parameters or computational cost at inference. Experimental results on the underwater datasets DUO (Detection Underwater Objects) and RUOD (Real-World Underwater Object Detection) show that BGLE-YOLO matches the accuracy of the benchmark model at an ultra-low computational cost of 6.2 GFLOPs and an ultra-low model size of 1.6 MB.
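The abstract's claim that reparameterization "improves detection accuracy without adding parameters or computational cost at inference" is characteristic of RepVGG-style structural reparameterization: train with parallel branches for accuracy, then algebraically fold them into a single convolution for deployment. The paper's exact LSH design is not given here, so the following is only a minimal PyTorch sketch under that assumption; the class name RepConvBlock and the branch layout are illustrative, not the authors' code. It shows how parallel 3x3, 1x1, and identity branches (each with BatchNorm) can be fused into one 3x3 convolution with identical outputs.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RepConvBlock(nn.Module):
    """Hypothetical RepVGG-style block: at training time, parallel 3x3,
    1x1, and identity branches (each followed by BatchNorm); after
    fuse(), a single 3x3 convolution producing the same outputs."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv3 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn3 = nn.BatchNorm2d(channels)
        self.conv1 = nn.Conv2d(channels, channels, 1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.bn_id = nn.BatchNorm2d(channels)  # identity branch (BN only)
        self.act = nn.SiLU()
        self.fused = None  # set by fuse()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.fused is not None:  # deploy path: one conv, same output
            return self.act(self.fused(x))
        return self.act(self.bn3(self.conv3(x))
                        + self.bn1(self.conv1(x))
                        + self.bn_id(x))

    @staticmethod
    def _fold_bn(weight: torch.Tensor, bn: nn.BatchNorm2d):
        # Fold BN statistics into the preceding conv's weight and bias.
        std = (bn.running_var + bn.eps).sqrt()
        w = weight * (bn.weight / std).reshape(-1, 1, 1, 1)
        b = bn.bias - bn.running_mean * bn.weight / std
        return w, b

    @torch.no_grad()
    def fuse(self):
        c = self.conv3.out_channels
        w3, b3 = self._fold_bn(self.conv3.weight, self.bn3)
        # Zero-pad the 1x1 kernel to 3x3 so the branches can be summed.
        w1, b1 = self._fold_bn(F.pad(self.conv1.weight, [1, 1, 1, 1]), self.bn1)
        # The identity branch equals a 3x3 conv with a centred identity kernel.
        eye = torch.zeros(c, c, 3, 3)
        for i in range(c):
            eye[i, i, 1, 1] = 1.0
        w_id, b_id = self._fold_bn(eye, self.bn_id)
        self.fused = nn.Conv2d(c, c, 3, padding=1)
        self.fused.weight.copy_(w3 + w1 + w_id)
        self.fused.bias.copy_(b3 + b1 + b_id)

if __name__ == "__main__":
    block = RepConvBlock(16).eval()  # eval(): BN uses running statistics
    x = torch.randn(1, 16, 32, 32)
    y_multi = block(x)               # multi-branch training-time path
    block.fuse()
    y_single = block(x)              # single fused convolution
    print(torch.allclose(y_multi, y_single, atol=1e-5))  # expect True
```

In eval mode the fused and unfused forward passes agree to numerical precision, which is exactly the property that lets a multi-branch design raise accuracy while the deployed model carries no extra parameters or FLOPs.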
Journal Introduction
Sensors (ISSN 1424-8220) provides an advanced forum for the science and technology of sensors and biosensors. It publishes reviews (including comprehensive reviews of complete sensor products), regular research papers, and short notes. Our aim is to encourage scientists to publish their experimental and theoretical results in as much detail as possible. There is no restriction on the length of papers. Full experimental details must be provided so that the results can be reproduced.