Learning Based Exteroception of Soft Underwater Manipulator With Soft Actuator Network
Kailuan Tang; Shaowu Tang; Chenghua Lu; Shijian Wu; Sicong Liu; Juan Yi; Jian S. Dai; Zheng Wang
IEEE Robotics and Automation Letters, vol. 9, no. 12, pp. 11082-11089, published 2024-10-29
DOI: 10.1109/LRA.2024.3487512
Citations: 0
Abstract
Interactions with environmental objects can induce substantial alterations in both exteroceptive and proprioceptive signals. However, deploying exteroceptive sensors within underwater soft manipulators faces numerous challenges and constraints, which limits their perception capabilities. In this article, we present a novel learning-based exteroceptive approach that uses internal proprioceptive signals and exploits the principles of the soft actuator network (SAN). Deformation and vibration caused by external collisions propagate through the SANs of underwater soft manipulators and can be detected by proprioceptive sensors. We extract features from the sensor signals and develop a fully-connected neural network (FCNN) classifier to determine collision positions. We constructed a training dataset and an independent validation dataset to train and validate the classifier. The experimental results confirm that the proposed method identifies collision locations with an accuracy of 97.11% on the independent validation dataset, demonstrating potential applications in underwater soft-robot perception and control.
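The abstract describes the pipeline only at a high level: features are extracted from internal proprioceptive sensor signals and fed to a fully-connected neural network that classifies the collision position. The paper's code is not given here; the following is a minimal sketch of such a pipeline, assuming simple per-channel statistical features (mean, standard deviation, peak-to-peak, RMS) and a small FCNN built with PyTorch. All function names, feature choices, layer sizes, and the number of collision classes are illustrative assumptions, not the authors' actual configuration.

```python
# Hypothetical sketch of a feature-extraction + FCNN collision-position classifier.
# Feature set, layer sizes, and class count are assumptions for illustration only.
import numpy as np
import torch
import torch.nn as nn

def extract_features(window: np.ndarray) -> np.ndarray:
    """Per-channel statistics from a (samples, channels) window of
    proprioceptive sensor readings (e.g., chamber pressures)."""
    feats = [
        window.mean(axis=0),
        window.std(axis=0),
        window.max(axis=0) - window.min(axis=0),   # peak-to-peak amplitude
        np.sqrt((window ** 2).mean(axis=0)),       # RMS
    ]
    return np.concatenate(feats)

class CollisionFCNN(nn.Module):
    """Small fully-connected classifier over the extracted feature vector."""
    def __init__(self, n_features: int, n_positions: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, n_positions),            # logits, one per collision location
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

if __name__ == "__main__":
    # Toy example: 6 sensor channels, 200-sample window, 8 candidate collision positions.
    rng = np.random.default_rng(0)
    window = rng.normal(size=(200, 6))
    x = torch.tensor(extract_features(window), dtype=torch.float32).unsqueeze(0)

    model = CollisionFCNN(n_features=x.shape[1], n_positions=8)
    predicted_position = model(x).argmax(dim=1).item()
    print(f"Predicted collision position class: {predicted_position}")
```

In practice the classifier would be trained on labeled collision windows before inference; the snippet above only shows the untrained forward pass to illustrate the data flow from raw sensor window to predicted collision class.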
About the journal:
This journal publishes peer-reviewed articles that provide timely and concise accounts of innovative research ideas and application results, reporting significant theoretical findings and application case studies in robotics and automation.