Finger flow: Reactive reach-while-grasp generation for robotic arms with multi-fingered hands

IF 5.2 · CAS Tier 2 (Computer Science) · JCR Q1 (Automation & Control Systems)
Xuming Meng, Henry Maurenbrecher, Alin Albu-Schäffer, Manuel Keppler
{"title":"Finger flow: Reactive reach-while-grasp generation for robotic arms with multi-fingered hands","authors":"Xuming Meng ,&nbsp;Henry Maurenbrecher ,&nbsp;Alin Albu-Schäffer ,&nbsp;Manuel Keppler","doi":"10.1016/j.robot.2025.105222","DOIUrl":null,"url":null,"abstract":"<div><div>Humans effortlessly grasp both stationary and moving objects in one-shot motions, fluidly adapting to disturbances and automatically recovering from failed attempts. In contrast, robots with multi-fingered hands often rely on pre-planned, sequential “reach-then-grasp” strategies, which result in slow, unnatural motions and restrict the robot’s ability to react dynamically to changes in the object’s location. Moreover, open-loop execution oftentimes leads to grasp failures. To address these challenges, we introduce Finger Flow (FF), a reactive motion generator that uses the visual feedback from an onboard camera and position feedback from fingers and arms to robustly reach and grasp stationary and moving objects with unpredictable behavior. During the reaching, FF continuously guides the hand to avoid finger-object collisions and adjusts the hand’s reactive opening and closure based on its relative position to the object. This state-dependent behavior results in automatic recovery from failed grasp attempts. We also provide formal guarantees of convergence and collision avoidance for stationary spherical objects. We evaluate FF on the DLR humanoid robot <em>neoDavid</em>, equipped with a multi-fingered hand, and quantitatively assess its performance in a series of grasping experiments involving fast and reactive grasping of a stationary or unpredictable spatially moving object. Running in a closed loop at 3 kHz, FF achieves an 87 % grasp success rate on the stationary object placed at random positions over 130 attempts. Interactive and adversarial human-to-robot handover experiments further demonstrate the robustness and effectiveness of FF.</div></div>","PeriodicalId":49592,"journal":{"name":"Robotics and Autonomous Systems","volume":"195 ","pages":"Article 105222"},"PeriodicalIF":5.2000,"publicationDate":"2025-10-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Robotics and Autonomous Systems","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0921889025003197","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Humans effortlessly grasp both stationary and moving objects in one-shot motions, fluidly adapting to disturbances and automatically recovering from failed attempts. In contrast, robots with multi-fingered hands often rely on pre-planned, sequential “reach-then-grasp” strategies, which result in slow, unnatural motions and restrict the robot’s ability to react dynamically to changes in the object’s location. Moreover, open-loop execution often leads to grasp failures. To address these challenges, we introduce Finger Flow (FF), a reactive motion generator that uses visual feedback from an onboard camera and position feedback from the fingers and arms to robustly reach and grasp stationary and moving objects with unpredictable behavior. During reaching, FF continuously guides the hand to avoid finger-object collisions and adjusts the hand’s reactive opening and closure based on its position relative to the object. This state-dependent behavior results in automatic recovery from failed grasp attempts. We also provide formal guarantees of convergence and collision avoidance for stationary spherical objects. We evaluate FF on the DLR humanoid robot neoDavid, equipped with a multi-fingered hand, and quantitatively assess its performance in a series of grasping experiments involving fast and reactive grasping of a stationary or unpredictably moving object. Running in a closed loop at 3 kHz, FF achieves an 87% grasp success rate on a stationary object placed at random positions over 130 attempts. Interactive and adversarial human-to-robot handover experiments further demonstrate the robustness and effectiveness of FF.
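The abstract does not specify FF's actual control law. As a hedged illustration of the state-dependent opening/closure idea it describes, the sketch below assumes a simple distance-based modulation of the finger opening inside a 3 kHz proportional reaching loop. All function names, gains, and thresholds here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hypothetical sketch of a state-dependent "reach-while-grasp" loop.
# The finger opening is modulated by the hand-object distance: far away
# the hand stays open (avoiding finger-object collisions during reach),
# and it closes as the palm converges on the object. Gains and radii
# below are assumed values for illustration only.

DT = 1.0 / 3000.0        # 3 kHz closed-loop update period (from the abstract)
OPEN_RADIUS = 0.15       # distance [m] at which fingers are fully open (assumed)
CLOSE_RADIUS = 0.03      # distance [m] at which fingers are fully closed (assumed)
REACH_GAIN = 4.0         # proportional reaching gain (assumed)

def finger_opening(dist: float) -> float:
    """Map hand-object distance to a normalized opening in [0, 1].

    Beyond OPEN_RADIUS the hand is fully open; inside CLOSE_RADIUS it is
    fully closed; linear in between.
    """
    x = (dist - CLOSE_RADIUS) / (OPEN_RADIUS - CLOSE_RADIUS)
    return float(np.clip(x, 0.0, 1.0))

def step(hand_pos: np.ndarray, obj_pos: np.ndarray):
    """One control tick: reach toward the object and set the hand opening."""
    error = obj_pos - hand_pos          # from vision + proprioception
    dist = float(np.linalg.norm(error))
    hand_vel = REACH_GAIN * error       # simple proportional reach
    opening = finger_opening(dist)      # state-dependent open/close
    return hand_pos + DT * hand_vel, opening

# Toy rollout toward a stationary object.
hand, obj = np.zeros(3), np.array([0.3, 0.1, 0.2])
for _ in range(9000):                   # 3 s of simulated time at 3 kHz
    hand, opening = step(hand, obj)
print(f"final distance: {np.linalg.norm(obj - hand):.4f} m, opening: {opening:.2f}")
```

Note how the recovery behavior the abstract highlights falls out of such a distance mapping: if a grasp fails and the object slips away, the distance grows and the hand automatically re-opens for another attempt.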
Source journal
Robotics and Autonomous Systems (Engineering & Technology: Robotics)
CiteScore: 9.00
Self-citation rate: 7.00%
Articles per year: 164
Review time: 4.5 months
Journal description: Robotics and Autonomous Systems carries articles describing fundamental developments in the field of robotics, with special emphasis on autonomous systems. An important goal of the journal is to extend the state of the art in both symbolic and sensory-based robot control and learning in the context of autonomous systems. It covers the theoretical, computational, and experimental aspects of autonomous systems, or modules of such systems.