A three-dimensional force estimation method for the cable-driven soft robot based on monocular images

Xiaohan Zhu, Ran Bu, Zhen Li, Fan Xu, Hesheng Wang
arXiv:2409.08033 [cs.RO] · arXiv - CS - Robotics · Published 2024-09-12

Abstract

Soft manipulators are known for their superiority in coping with high-safety-demanding interaction tasks, e.g., robot-assisted surgeries, elderly caring, etc. Yet the challenges residing in real-time contact feedback have hindered further applications in precise manipulation. This paper proposes an end-to-end network to estimate the 3D contact force of the soft robot, with the aim of enhancing its capabilities in interactive tasks. The presented method features directly utilizing monocular images fused with multidimensional actuation information as the network inputs. This approach simplifies the preprocessing of raw data compared to related studies that utilize 3D shape information for network inputs, consequently reducing configuration reconstruction errors. The unified feature representation module is devised to elevate low-dimensional features from the system's actuation signals to the same level as image features, facilitating smoother integration of multimodal information. The proposed method has been experimentally validated in the soft robot testbed, achieving satisfying accuracy in 3D force estimation (with a mean relative error of 0.84% compared to the best-reported result of 2.2% in the related works).
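The abstract's fusion scheme, lifting low-dimensional actuation signals to the same level as image features before regressing a 3D force, can be sketched as follows. This is a minimal illustrative sketch, not the paper's network: the dimensions, the linear projection with `tanh`, the concatenation, and the linear regression head are all assumptions standing in for the learned modules described in the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper).
IMG_FEAT_DIM = 256   # flattened feature vector from the monocular image
ACT_DIM = 4          # cable actuation signals (e.g., motor positions/tensions)

# Hypothetical learned weights: a projection that lifts the low-dimensional
# actuation vector to the image-feature dimension ("unified representation").
W_proj = rng.standard_normal((IMG_FEAT_DIM, ACT_DIM)) * 0.1
b_proj = np.zeros(IMG_FEAT_DIM)

# Hypothetical regression head mapping the fused features to a 3D force.
W_head = rng.standard_normal((3, 2 * IMG_FEAT_DIM)) * 0.1
b_head = np.zeros(3)

def estimate_force(img_feat: np.ndarray, actuation: np.ndarray) -> np.ndarray:
    """Fuse image features with projected actuation signals, then regress force."""
    act_feat = np.tanh(W_proj @ actuation + b_proj)  # lift to image-feature level
    fused = np.concatenate([img_feat, act_feat])     # unified multimodal feature
    return W_head @ fused + b_head                   # 3D contact force estimate

force = estimate_force(rng.standard_normal(IMG_FEAT_DIM),
                       rng.standard_normal(ACT_DIM))
print(force.shape)  # (3,)
```

The point of the projection step is that a 4-dimensional actuation signal concatenated directly to a 256-dimensional image feature would be swamped during training; projecting it to a comparable dimensionality first lets the two modalities contribute on an equal footing.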