Robot programming by demonstration with a monocular RGB camera

Impact factor: 1.9 · CAS Zone 4 (Computer Science) · JCR Q3 (Engineering, Industrial)
Kaimeng Wang, Te Tang
{"title":"用单目RGB摄像机演示机器人编程","authors":"Kaimeng Wang, Te Tang","doi":"10.1108/ir-04-2022-0093","DOIUrl":null,"url":null,"abstract":"\nPurpose\nThis paper aims to present a new approach for robot programming by demonstration, which generates robot programs by tracking 6 dimensional (6D) pose of the demonstrator’s hand using a single red green blue (RGB) camera without requiring any additional sensors.\n\n\nDesign/methodology/approach\nThe proposed method learns robot grasps and trajectories directly from a single human demonstration by tracking the movements of both human hands and objects. To recover the 6D pose of an object from a single RGB image, a deep learning–based method is used to detect the keypoints of the object first and then solve a perspective-n-point problem. This method is first extended to estimate the 6D pose of the nonrigid hand by separating fingers into multiple rigid bones linked with hand joints. The accurate robot grasp can be generated according to the relative positions between hands and objects in the 2 dimensional space. Robot end-effector trajectories are generated from hand movements and then refined by objects’ start and end positions.\n\n\nFindings\nExperiments are conducted on a FANUC LR Mate 200iD robot to verify the proposed approach. The results show the feasibility of generating robot programs by observing human demonstration once using a single RGB camera.\n\n\nOriginality/value\nThe proposed approach provides an efficient and low-cost robot programming method with a single RGB camera. A new 6D hand pose estimation approach, which is used to generate robot grasps and trajectories, is developed.\n","PeriodicalId":54987,"journal":{"name":"Industrial Robot-The International Journal of Robotics Research and Application","volume":"31 1","pages":"234-245"},"PeriodicalIF":1.9000,"publicationDate":"2022-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Robot programming by demonstration with a monocular RGB camera\",\"authors\":\"Kaimeng Wang, Te Tang\",\"doi\":\"10.1108/ir-04-2022-0093\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"\\nPurpose\\nThis paper aims to present a new approach for robot programming by demonstration, which generates robot programs by tracking 6 dimensional (6D) pose of the demonstrator’s hand using a single red green blue (RGB) camera without requiring any additional sensors.\\n\\n\\nDesign/methodology/approach\\nThe proposed method learns robot grasps and trajectories directly from a single human demonstration by tracking the movements of both human hands and objects. To recover the 6D pose of an object from a single RGB image, a deep learning–based method is used to detect the keypoints of the object first and then solve a perspective-n-point problem. This method is first extended to estimate the 6D pose of the nonrigid hand by separating fingers into multiple rigid bones linked with hand joints. The accurate robot grasp can be generated according to the relative positions between hands and objects in the 2 dimensional space. Robot end-effector trajectories are generated from hand movements and then refined by objects’ start and end positions.\\n\\n\\nFindings\\nExperiments are conducted on a FANUC LR Mate 200iD robot to verify the proposed approach. 
The results show the feasibility of generating robot programs by observing human demonstration once using a single RGB camera.\\n\\n\\nOriginality/value\\nThe proposed approach provides an efficient and low-cost robot programming method with a single RGB camera. A new 6D hand pose estimation approach, which is used to generate robot grasps and trajectories, is developed.\\n\",\"PeriodicalId\":54987,\"journal\":{\"name\":\"Industrial Robot-The International Journal of Robotics Research and Application\",\"volume\":\"31 1\",\"pages\":\"234-245\"},\"PeriodicalIF\":1.9000,\"publicationDate\":\"2022-09-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Industrial Robot-The International Journal of Robotics Research and Application\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1108/ir-04-2022-0093\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"ENGINEERING, INDUSTRIAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Industrial Robot-The International Journal of Robotics Research and Application","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1108/ir-04-2022-0093","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, INDUSTRIAL","Score":null,"Total":0}
Citations: 3

Abstract

Purpose
This paper aims to present a new approach for robot programming by demonstration, which generates robot programs by tracking the six-dimensional (6D) pose of the demonstrator's hand using a single red-green-blue (RGB) camera, without requiring any additional sensors.

Design/methodology/approach
The proposed method learns robot grasps and trajectories directly from a single human demonstration by tracking the movements of both the human hand and the manipulated object. To recover the 6D pose of an object from a single RGB image, a deep learning-based method first detects the object's keypoints and then solves a perspective-n-point (PnP) problem. This method is then extended to estimate the 6D pose of the non-rigid hand by separating the fingers into multiple rigid bones linked by hand joints. An accurate robot grasp is generated from the relative positions of the hand and the object in two-dimensional (2D) space. Robot end-effector trajectories are generated from the hand movements and then refined using the object's start and end positions.

Findings
Experiments were conducted on a FANUC LR Mate 200iD robot to verify the proposed approach. The results show the feasibility of generating robot programs from a single observation of a human demonstration with a single RGB camera.

Originality/value
The proposed approach provides an efficient, low-cost robot programming method that needs only a single RGB camera. A new 6D hand pose estimation approach, used to generate robot grasps and trajectories, is developed.
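As a rough illustration of the keypoint-plus-PnP step described in the abstract, the Python sketch below recovers an object's 6D pose from detected 2D keypoints with OpenCV. This is a minimal sketch under stated assumptions, not the authors' implementation: the model points, camera intrinsics and keypoint ordering are placeholders.

import numpy as np
import cv2

# 3D keypoint positions in the object's own frame (placeholder values, metres).
MODEL_POINTS = np.array([
    [0.00, 0.00, 0.00],
    [0.06, 0.00, 0.00],
    [0.00, 0.06, 0.00],
    [0.06, 0.06, 0.00],
    [0.00, 0.00, 0.04],
    [0.06, 0.00, 0.04],
], dtype=np.float64)

# Camera intrinsics from an offline calibration (placeholder values).
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
DIST_COEFFS = np.zeros(5)  # assume negligible lens distortion

def estimate_object_pose(image_points):
    """Solve PnP: N detected 2D keypoints (N x 2 array, in the same order as
    MODEL_POINTS) -> rotation matrix R and translation vector t."""
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS,
                                  np.asarray(image_points, dtype=np.float64),
                                  K, DIST_COEFFS,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP failed to converge")
    R, _ = cv2.Rodrigues(rvec)  # axis-angle vector -> 3 x 3 rotation matrix
    return R, tvec

The hand-pose extension described in the abstract plausibly reuses the same machinery per finger bone, since each bone is treated as a small rigid body linked to the hand joints.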
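The grasp-transfer idea, reproducing the demonstrated hand-object relation when the object appears at a new pose, reduces to a few homogeneous-transform products. Again a hedged sketch: the frame names, the 4 x 4 transform convention and the function names are assumptions for illustration, not the paper's notation.

import numpy as np

def homogeneous(R, t):
    """Pack a 3 x 3 rotation matrix and a translation into a 4 x 4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.ravel(t)
    return T

def grasp_from_demo(T_cam_hand, T_cam_obj, T_cam_obj_new):
    """Replay a demonstrated grasp on the object at a new pose.

    T_obj_hand expresses the demonstrated hand pose in the object frame;
    composing it with the object's new pose reproduces the same relative
    grasp for the robot end effector."""
    T_obj_hand = np.linalg.inv(T_cam_obj) @ T_cam_hand
    return T_cam_obj_new @ T_obj_hand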
Source journal
CiteScore: 4.50
Self-citation rate: 16.70%
Articles published: 86
Review time: 5.7 months
About the journal: Industrial Robot publishes peer-reviewed research articles, technology reviews and specially commissioned case studies. Each issue includes high-quality content covering all aspects of robotic technology, reflecting the most interesting and strategically important research and development activities from around the world. The journal's policy of not publishing work that has only been tested in simulation means that only the very best and most practical research articles are included. This ensures that the material published has real relevance and value for commercial manufacturing and research organizations. Industrial Robot's coverage includes, but is not restricted to: automatic assembly, flexible manufacturing, programming optimisation, simulation and offline programming, service robots, autonomous robots, swarm intelligence, humanoid robots, prosthetics and exoskeletons, machine intelligence, military robots, underwater and aerial robots, cooperative robots, flexible grippers and tactile sensing, robot vision, teleoperation, mobile robots, search and rescue robots, robot welding, collision avoidance, robotic machining and surgical robots. Call for Papers 2020: AI for autonomous unmanned systems, agricultural robots, brain-computer interfaces for human-robot interaction, cooperative robots, robots for environmental monitoring, rehabilitation robots and wearable robotics/exoskeletons.