FlowSight: Vision-Based Artificial Lateral Line Sensor for Water Flow Perception
Tiandong Zhang; Rui Wang; Qiyuan Cao; Shaowei Cui; Gang Zheng; Shuo Wang
IEEE Transactions on Robotics, vol. 41, pp. 3260-3277, published 2025-03-06. DOI: 10.1109/TRO.2025.3567551
Citations: 0
Abstract
This article presents a novel vision-based artificial lateral line (ALL) sensor, FlowSight, that enhances the perception capabilities of underwater robots. Through an autonomous vision system, FlowSight allows simultaneous sensing of the speed and direction of local water flow without relying on external auxiliary equipment. Inspired by the lateral line neuromasts of fish, a flexible bionic tentacle is designed to sense water flow. The deformation and motion characteristics of the tentacle are modeled and analyzed using bidirectional fluid-structure interaction (FSI) simulation. Upon contact with water flow, the tentacle converts water flow information into elastic deformation, which is captured and processed into an image sequence by the autonomous vision system. Subsequently, a water flow perception method based on deep neural networks is proposed to estimate the flow speed and direction from the captured image sequence. The perception network is trained and tested on data collected from practical experiments conducted in a controllable swim tunnel. Finally, the FlowSight sensor is integrated into the bionic underwater robot RoboDact, and a closed-loop motion control experiment based on water flow perception is conducted. Experiments in the swim tunnel and a water pool demonstrate the feasibility and effectiveness of the FlowSight sensor and the water flow perception method.
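The abstract does not detail the perception network itself, so the following PyTorch sketch only illustrates one plausible way an image-sequence-based flow perception model could be organized: a per-frame CNN encodes the tentacle deformation images, a recurrent layer aggregates the sequence, and two regression heads output flow speed and flow direction. The class name, layer sizes, 64x64 grayscale input, sequence length, and the sin/cos direction encoding are all illustrative assumptions, not the paper's design.

```python
# Hypothetical sketch of an image-sequence flow perception model (not the
# authors' actual FlowSight network): CNN per frame -> GRU over the sequence
# -> regression heads for flow speed and direction.
import torch
import torch.nn as nn


class FlowPerceptionNet(nn.Module):
    def __init__(self, hidden_dim: int = 128):
        super().__init__()
        # Per-frame encoder: 1-channel 64x64 frame -> feature vector (assumed input size)
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),  # 16 -> 8
            nn.ReLU(),
            nn.Flatten(),                        # 64 * 8 * 8 = 4096 features
            nn.Linear(64 * 8 * 8, hidden_dim),
        )
        # Temporal aggregation over the captured image sequence
        self.gru = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        # Output heads: scalar flow speed and direction encoded as (sin, cos)
        self.speed_head = nn.Linear(hidden_dim, 1)
        self.direction_head = nn.Linear(hidden_dim, 2)

    def forward(self, frames: torch.Tensor):
        # frames: (batch, seq_len, 1, 64, 64)
        b, t, c, h, w = frames.shape
        feats = self.encoder(frames.view(b * t, c, h, w)).view(b, t, -1)
        _, last_hidden = self.gru(feats)          # (1, batch, hidden_dim)
        summary = last_hidden.squeeze(0)
        speed = self.speed_head(summary)          # (batch, 1), e.g. in m/s
        direction = self.direction_head(summary)  # (batch, 2) -> (sin θ, cos θ)
        return speed, direction


if __name__ == "__main__":
    model = FlowPerceptionNet()
    dummy = torch.randn(4, 10, 1, 64, 64)         # 4 sequences of 10 frames each
    speed, direction = model(dummy)
    print(speed.shape, direction.shape)           # torch.Size([4, 1]) torch.Size([4, 2])
```

Encoding direction as (sin θ, cos θ) sidesteps the wrap-around discontinuity at ±180°; the paper may well use a different parameterization or network topology.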
Journal Introduction:
The IEEE Transactions on Robotics (T-RO) is dedicated to publishing fundamental papers covering all facets of robotics, drawing on interdisciplinary approaches from computer science, control systems, electrical engineering, mathematics, mechanical engineering, and beyond. From industrial applications to service and personal assistants, surgical operations to space, underwater, and remote exploration, robots and intelligent machines play pivotal roles across various domains, including entertainment, safety, search and rescue, military applications, agriculture, and intelligent vehicles.
Special emphasis is placed on intelligent machines and systems designed for unstructured environments, where a significant portion of the environment remains unknown and beyond direct sensing or control.