{"title":"MOA-YOLO:一种精确、实时、轻量级的基于yolov10的深海鱼类检测算法","authors":"Zhenyu Hu;Qi Chen","doi":"10.1109/JSEN.2025.3574723","DOIUrl":null,"url":null,"abstract":"The use of autonomous underwater vehicles (AUVs) is an efficient mean to detect deep-sea fishes due to their great ability to complete undersea mission autonomously. However, research on deep-sea fishes suffers from dynamic and small objects in degraded environments, making it difficult to recognize deep sea. In this article, an accurate, real-time, and lightweight deep-sea fish detection algorithm is proposed based on YOLOv10, named MOA-YOLO. First, an underwater image enhancement algorithm based on adaptive standardization and normalization networks is adopted to correct the color and contrast of deep-sea images. Then, the multihead latent attention (MLA) mechanism is added to the front of each detection head of MOA-YOLO to extract more object features, enhancing the discrimination between objects and surroundings. Next, an optimized normalized Wasserstein distance (ONWD) loss function is introduced to replace the complete intersection over union (CIOU) loss function, improving the ability to detect small objects. Finally, CBS and C2f modules in the backbone and neck are replaced with alterable kernel convolution (AKConv) and AKC2f modules, respectively, balancing the detection speed and accuracy. The comparative experiments are operated on both the desktop computer and Nvidia Jetson Xavier NX. Compared with mainstream algorithms, the precision, recall, mean average precision (<inline-formula> <tex-math>${\\text {mAP}}_{{50}}$ </tex-math></inline-formula>), <inline-formula> <tex-math>${\\text {mAP}}_{\\text {50:95}}$ </tex-math></inline-formula>, and frames per second (FPS) of MOA-YOLO are improved by at least 3.1%, 13.7%, 8.0%, 6.9%, and 8.8%, respectively. The experimental results prove that MOA-YOLO achieves excellent performance on deep-sea fish detection with the AUV. The code is available at <uri>https://github.com/hu167/code</uri>","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"25 13","pages":"23933-23947"},"PeriodicalIF":4.3000,"publicationDate":"2025-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"MOA-YOLO: An Accurate, Real-Time, and Lightweight YOLOv10-Based Algorithm for Deep-Sea Fish Detection\",\"authors\":\"Zhenyu Hu;Qi Chen\",\"doi\":\"10.1109/JSEN.2025.3574723\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The use of autonomous underwater vehicles (AUVs) is an efficient mean to detect deep-sea fishes due to their great ability to complete undersea mission autonomously. However, research on deep-sea fishes suffers from dynamic and small objects in degraded environments, making it difficult to recognize deep sea. In this article, an accurate, real-time, and lightweight deep-sea fish detection algorithm is proposed based on YOLOv10, named MOA-YOLO. First, an underwater image enhancement algorithm based on adaptive standardization and normalization networks is adopted to correct the color and contrast of deep-sea images. Then, the multihead latent attention (MLA) mechanism is added to the front of each detection head of MOA-YOLO to extract more object features, enhancing the discrimination between objects and surroundings. 
Next, an optimized normalized Wasserstein distance (ONWD) loss function is introduced to replace the complete intersection over union (CIOU) loss function, improving the ability to detect small objects. Finally, CBS and C2f modules in the backbone and neck are replaced with alterable kernel convolution (AKConv) and AKC2f modules, respectively, balancing the detection speed and accuracy. The comparative experiments are operated on both the desktop computer and Nvidia Jetson Xavier NX. Compared with mainstream algorithms, the precision, recall, mean average precision (<inline-formula> <tex-math>${\\\\text {mAP}}_{{50}}$ </tex-math></inline-formula>), <inline-formula> <tex-math>${\\\\text {mAP}}_{\\\\text {50:95}}$ </tex-math></inline-formula>, and frames per second (FPS) of MOA-YOLO are improved by at least 3.1%, 13.7%, 8.0%, 6.9%, and 8.8%, respectively. The experimental results prove that MOA-YOLO achieves excellent performance on deep-sea fish detection with the AUV. The code is available at <uri>https://github.com/hu167/code</uri>\",\"PeriodicalId\":447,\"journal\":{\"name\":\"IEEE Sensors Journal\",\"volume\":\"25 13\",\"pages\":\"23933-23947\"},\"PeriodicalIF\":4.3000,\"publicationDate\":\"2025-06-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Sensors Journal\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/11024104/\",\"RegionNum\":2,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/11024104/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
MOA-YOLO: An Accurate, Real-Time, and Lightweight YOLOv10-Based Algorithm for Deep-Sea Fish Detection
Autonomous underwater vehicles (AUVs) are an efficient means of detecting deep-sea fish because of their ability to complete undersea missions autonomously. However, deep-sea fish detection is hampered by dynamic, small objects in degraded imaging environments, which makes recognition in the deep sea difficult. In this article, an accurate, real-time, and lightweight deep-sea fish detection algorithm based on YOLOv10, named MOA-YOLO, is proposed. First, an underwater image enhancement algorithm based on adaptive standardization and normalization networks is adopted to correct the color and contrast of deep-sea images. Then, a multihead latent attention (MLA) mechanism is added in front of each detection head of MOA-YOLO to extract more object features, enhancing the discrimination between objects and their surroundings. Next, an optimized normalized Wasserstein distance (ONWD) loss function is introduced to replace the complete intersection over union (CIoU) loss function, improving the ability to detect small objects. Finally, the CBS and C2f modules in the backbone and neck are replaced with alterable kernel convolution (AKConv) and AKC2f modules, respectively, balancing detection speed and accuracy. Comparative experiments are conducted on both a desktop computer and an Nvidia Jetson Xavier NX. Compared with mainstream algorithms, the precision, recall, mean average precision (${\text{mAP}}_{50}$), ${\text{mAP}}_{50:95}$, and frames per second (FPS) of MOA-YOLO are improved by at least 3.1%, 13.7%, 8.0%, 6.9%, and 8.8%, respectively. The experimental results show that MOA-YOLO achieves excellent performance in deep-sea fish detection with an AUV. The code is available at https://github.com/hu167/code
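To make the small-object loss term concrete, below is a minimal sketch of the baseline normalized Wasserstein distance (NWD) loss that ONWD builds on; it is not the authors' optimized variant. It assumes boxes in (cx, cy, w, h) format, each modeled as a 2-D Gaussian, and a dataset-dependent normalization constant c; the function name nwd_loss and the default c = 12.8 are illustrative choices, not taken from the paper.

import torch

def nwd_loss(pred, target, c=12.8, eps=1e-7):
    # Baseline NWD loss for boxes given as (cx, cy, w, h) tensors of shape (N, 4).
    # Each box is treated as a 2-D Gaussian N((cx, cy), diag(w^2/4, h^2/4)); the
    # squared 2-Wasserstein distance between two such Gaussians has this closed form.
    w2 = ((pred[..., 0] - target[..., 0]) ** 2
          + (pred[..., 1] - target[..., 1]) ** 2
          + ((pred[..., 2] - target[..., 2]) / 2) ** 2
          + ((pred[..., 3] - target[..., 3]) / 2) ** 2)
    nwd = torch.exp(-torch.sqrt(w2 + eps) / c)  # similarity in (0, 1]
    return 1.0 - nwd  # 0 for identical boxes, approaching 1 as they diverge

Unlike IoU-based losses such as CIoU, this similarity remains smooth and nonzero even when two small boxes do not overlap, which is why Wasserstein-style losses are favored for small-object detection.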
Journal introduction:
The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:
-Sensor Phenomenology, Modelling, and Evaluation
-Sensor Materials, Processing, and Fabrication
-Chemical and Gas Sensors
-Microfluidics and Biosensors
-Optical Sensors
-Physical Sensors: Temperature, Mechanical, Magnetic, and others
-Acoustic and Ultrasonic Sensors
-Sensor Packaging
-Sensor Networks
-Sensor Applications
-Sensor Systems: Signals, Processing, and Interfaces
-Actuators and Sensor Power Systems
-Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
-Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion, processing of wave e.g., electromagnetic and acoustic; and non-wave, e.g., chemical, gravity, particle, thermal, radiative and non-radiative sensor data, detection, estimation and classification based on sensor data)
-Sensors in Industrial Practice