Learning an Image-Based Visual Servoing Controller for Object Grasping

IF 0.9 · CAS Tier 4 (Computer Science) · JCR Q4 (Robotics)
Shuaijun Wang, Lining Sun, Mantian Li, Pengfei Wang, Fusheng Zha, Wei Guo, Qiang Li
{"title":"Learning an Image-Based Visual Servoing Controller for Object Grasping","authors":"Shuaijun Wang, Lining Sun, Mantian Li, Pengfei Wang, Fusheng Zha, Wei Guo, Qiang Li","doi":"10.1142/s0219843623500330","DOIUrl":null,"url":null,"abstract":"<p>Adaptive and cooperative control of arms and fingers for natural object reaching and grasping, without explicit 3D geometric pose information, is observed in humans. In this study, an image-based visual servoing controller, inspired by human grasping behavior, is proposed for an arm-gripper system. A large-scale dataset is constructed using Pybullet simulation, comprising paired images and arm-gripper control signals mimicking expert grasping behavior. Leveraging this dataset, a network is directly trained to derive a control policy that maps images to cooperative grasp control. Subsequently, the learned synergy grasping policy from the network is directly applied to a real robot with the same configuration. Experimental results demonstrate the effectiveness of the algorithm. Videos can be found at https://www.bilibili.com/video/BV1tg4y1b7Qe/.</p>","PeriodicalId":50319,"journal":{"name":"International Journal of Humanoid Robotics","volume":"51 1","pages":""},"PeriodicalIF":0.9000,"publicationDate":"2024-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Humanoid Robotics","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1142/s0219843623500330","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ROBOTICS","Score":null,"Total":0}
引用次数: 0

Abstract

Adaptive and cooperative control of arms and fingers for natural object reaching and grasping, without explicit 3D geometric pose information, is observed in humans. In this study, an image-based visual servoing controller, inspired by human grasping behavior, is proposed for an arm-gripper system. A large-scale dataset is constructed using PyBullet simulation, comprising paired images and arm-gripper control signals mimicking expert grasping behavior. Leveraging this dataset, a network is directly trained to derive a control policy that maps images to cooperative grasp control. Subsequently, the learned synergy grasping policy from the network is directly applied to a real robot with the same configuration. Experimental results demonstrate the effectiveness of the algorithm. Videos can be found at https://www.bilibili.com/video/BV1tg4y1b7Qe/.
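The abstract outlines an imitation-learning pipeline: expert grasping demonstrations are generated in PyBullet simulation, stored as paired images and arm-gripper control signals, and used to train a network that maps images directly to cooperative control commands. As a rough illustration of that idea only, the following minimal PyTorch sketch shows a convolutional policy trained by behavior cloning on such pairs; the architecture, the control dimensionality, and the names GraspPolicyNet and train_step are assumptions for illustration and are not taken from the paper.

# Illustrative sketch only: the paper does not specify its network architecture,
# so layer sizes, control dimensionality, and the training step are assumptions.
import torch
import torch.nn as nn

class GraspPolicyNet(nn.Module):
    """Maps a camera image to a cooperative arm-gripper control command."""
    def __init__(self, control_dim=7):  # e.g. 6 arm commands + 1 gripper signal (assumed)
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, control_dim),
        )

    def forward(self, image):
        # image: (batch, 3, H, W) RGB frames from the simulated or real camera
        return self.head(self.encoder(image))

# Behavior-cloning-style training step on (image, expert control) pairs,
# mirroring the paired image / control-signal dataset described above.
policy = GraspPolicyNet()
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

def train_step(images, expert_controls):
    optimizer.zero_grad()
    predicted = policy(images)          # predicted arm-gripper commands
    loss = loss_fn(predicted, expert_controls)
    loss.backward()
    optimizer.step()
    return loss.item()

In this sketch the same trained policy would then be run on the real robot's camera stream, which is consistent with the sim-to-real transfer described in the abstract, though the paper's actual transfer procedure is not detailed here.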

Source Journal
International Journal of Humanoid Robotics (Engineering & Technology / Robotics)
CiteScore: 3.50
Self-citation rate: 13.30%
Articles per year: 29
Review time: 6 months
Journal description: The International Journal of Humanoid Robotics (IJHR) covers all subjects on the mind and body of humanoid robots. It is dedicated to advancing new theories, new techniques, and new implementations contributing to the successful achievement of future robots which not only imitate human beings, but also serve human beings. While IJHR encourages the contribution of original papers which are solidly grounded on proven theories or experimental procedures, the journal also encourages innovative papers which venture into new frontier areas in robotics. Such papers need not necessarily demonstrate, in the early stages of research and development, the full potential of new findings on a physical or virtual robot. IJHR welcomes original papers in the following categories:
- Research papers, which disseminate scientific findings contributing to solving technical issues underlying the development of humanoid robots, or biologically inspired robots, having multiple functionality related to either physical capabilities (i.e. motion) or mental capabilities (i.e. intelligence)
- Review articles, which describe, in non-technical terms, the latest in basic theories, principles, and algorithmic solutions
- Short articles (e.g. feature articles and dialogues), which discuss the latest significant achievements and the future trends in robotics R&D
- Papers on curriculum development in humanoid robot education
- Book reviews