{"title":"用于潜水辅助auv的低成本视觉通信","authors":"Arif Wibisono;Hyoung-Kyu Song;Byung Moo Lee","doi":"10.1109/JSEN.2025.3540073","DOIUrl":null,"url":null,"abstract":"The cognitive autonomous diving buddy (CADDY) is designed to serve as a companion and navigation system for human divers, leveraging an autonomous underwater vehicle (AUV). However, the implementation of this system faces challenges such as reliance on complex sensing devices and the use of wearable devices that limit flexibility. To address these issues, this study offers a cost-effective and simple solution based on computer vision (CV) technology. This approach eliminates the need for wearable devices using hand gesture recognition as a key feature. The system prototype was developed using a single camera, lightweight computing devices, and a simple algorithm. Laboratory testing demonstrated high recognition accuracy, ranging from 85% to 95%. The “Five” gesture achieved the highest accuracy at 95%, while the “Three” gesture had the lowest accuracy at 85%. This system offers not only better flexibility but also a simpler numeric gesture representation compared with conventional methods. However, the system faces challenges such as reduced accuracy for certain gestures. In real-world applications, challenges such as low visibility, varying lighting conditions, and turbulence effects can impact detection stability and gesture processing accuracy, especially if the system is further developed for multi-AUV collaboration. Nevertheless, this study makes a significant contribution by offering a cost-effective and flexible nonwearable approach as a foundation for developing reliable underwater communication systems.","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"25 7","pages":"12167-12171"},"PeriodicalIF":4.3000,"publicationDate":"2025-02-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Low-Cost Visual-Based Communication for Diver-Assist AUVs\",\"authors\":\"Arif Wibisono;Hyoung-Kyu Song;Byung Moo Lee\",\"doi\":\"10.1109/JSEN.2025.3540073\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The cognitive autonomous diving buddy (CADDY) is designed to serve as a companion and navigation system for human divers, leveraging an autonomous underwater vehicle (AUV). However, the implementation of this system faces challenges such as reliance on complex sensing devices and the use of wearable devices that limit flexibility. To address these issues, this study offers a cost-effective and simple solution based on computer vision (CV) technology. This approach eliminates the need for wearable devices using hand gesture recognition as a key feature. The system prototype was developed using a single camera, lightweight computing devices, and a simple algorithm. Laboratory testing demonstrated high recognition accuracy, ranging from 85% to 95%. The “Five” gesture achieved the highest accuracy at 95%, while the “Three” gesture had the lowest accuracy at 85%. This system offers not only better flexibility but also a simpler numeric gesture representation compared with conventional methods. However, the system faces challenges such as reduced accuracy for certain gestures. In real-world applications, challenges such as low visibility, varying lighting conditions, and turbulence effects can impact detection stability and gesture processing accuracy, especially if the system is further developed for multi-AUV collaboration. 
Nevertheless, this study makes a significant contribution by offering a cost-effective and flexible nonwearable approach as a foundation for developing reliable underwater communication systems.\",\"PeriodicalId\":447,\"journal\":{\"name\":\"IEEE Sensors Journal\",\"volume\":\"25 7\",\"pages\":\"12167-12171\"},\"PeriodicalIF\":4.3000,\"publicationDate\":\"2025-02-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Sensors Journal\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10896462/\",\"RegionNum\":2,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/10896462/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Low-Cost Visual-Based Communication for Diver-Assist AUVs
The cognitive autonomous diving buddy (CADDY) is designed to serve as a companion and navigation system for human divers, leveraging an autonomous underwater vehicle (AUV). However, the implementation of this system faces challenges such as reliance on complex sensing devices and the use of wearable devices that limit flexibility. To address these issues, this study offers a cost-effective and simple solution based on computer vision (CV) technology. This approach eliminates the need for wearable devices using hand gesture recognition as a key feature. The system prototype was developed using a single camera, lightweight computing devices, and a simple algorithm. Laboratory testing demonstrated high recognition accuracy, ranging from 85% to 95%. The “Five” gesture achieved the highest accuracy at 95%, while the “Three” gesture had the lowest accuracy at 85%. This system offers not only better flexibility but also a simpler numeric gesture representation compared with conventional methods. However, the system faces challenges such as reduced accuracy for certain gestures. In real-world applications, challenges such as low visibility, varying lighting conditions, and turbulence effects can impact detection stability and gesture processing accuracy, especially if the system is further developed for multi-AUV collaboration. Nevertheless, this study makes a significant contribution by offering a cost-effective and flexible nonwearable approach as a foundation for developing reliable underwater communication systems.
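The abstract does not disclose the recognition algorithm itself, but a minimal single-camera numeric-gesture counter of the kind described (one camera, lightweight computing, simple algorithm) can be sketched with OpenCV. Everything below — the HSV skin-color threshold, the convexity-defect angle cutoff, and the mapping from finger gaps to the gestures "One" through "Five" — is an illustrative assumption, not the authors' method.

```python
# Hypothetical sketch of a single-camera numeric-gesture ("One".."Five") counter.
# Thresholds and the convexity-defect heuristic are illustrative assumptions,
# not the method used in the paper.
import math

import cv2
import numpy as np


def count_fingers(frame_bgr: np.ndarray) -> int:
    """Estimate how many fingers are raised in a BGR frame (0-5)."""
    # 1. Segment the hand with a simple HSV skin-color threshold (assumed values).
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 30, 60), (25, 180, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # 2. Take the largest contour as the hand; ignore small noise blobs.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 2000:
        return 0

    # 3. Count convexity defects whose inner angle is sharp enough to be a
    #    gap between two extended fingers; raised fingers ~= gaps + 1.
    hull_idx = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull_idx)
    if defects is None:
        return 1  # a fist or a single finger yields no usable defects
    gaps = 0
    for start_i, end_i, far_i, _depth in defects[:, 0]:
        a = np.linalg.norm(hand[end_i][0] - hand[start_i][0])
        b = np.linalg.norm(hand[far_i][0] - hand[start_i][0])
        c = np.linalg.norm(hand[end_i][0] - hand[far_i][0])
        cos_angle = (b ** 2 + c ** 2 - a ** 2) / (2 * b * c + 1e-6)
        angle = math.acos(max(-1.0, min(1.0, cos_angle)))
        if angle < math.pi / 2:  # gap narrower than 90 degrees -> finger gap
            gaps += 1
    return min(gaps + 1, 5)


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # single camera, as in the prototype
    ok, frame = cap.read()
    if ok:
        print("Detected numeric gesture:", count_fingers(frame))
    cap.release()
```

In a deployment of the kind the paper targets, such a counter would presumably run per frame on the AUV's onboard computer with temporal smoothing over several frames before a gesture is accepted as a command; underwater color shifts and low visibility would likely require replacing the fixed skin-color threshold with a learned hand detector.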
Journal introduction:
The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:
-Sensor Phenomenology, Modelling, and Evaluation
-Sensor Materials, Processing, and Fabrication
-Chemical and Gas Sensors
-Microfluidics and Biosensors
-Optical Sensors
-Physical Sensors: Temperature, Mechanical, Magnetic, and others
-Acoustic and Ultrasonic Sensors
-Sensor Packaging
-Sensor Networks
-Sensor Applications
-Sensor Systems: Signals, Processing, and Interfaces
-Actuators and Sensor Power Systems
-Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
-Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave (e.g., electromagnetic and acoustic) and non-wave (e.g., chemical, gravity, particle, thermal, radiative and non-radiative) sensor data; detection, estimation, and classification based on sensor data)
-Sensors in Industrial Practice