Reinforcement Learning With Stereo-View Observation for Robust Electronic Component Robotic Insertion

Grzegorz Bartyzel, Wojciech Półchłopek, Dominik Rzepka
{"title":"Reinforcement Learning With Stereo-View Observation for Robust Electronic Component Robotic Insertion","authors":"Grzegorz Bartyzel, Wojciech Półchłopek, Dominik Rzepka","doi":"10.1007/s10846-023-01970-8","DOIUrl":null,"url":null,"abstract":"Abstract In modern manufacturing, assembly tasks are a major challenge for robotics. In the manufacturing industry, a wide range of insertion tasks can be found, from peg-in-hole insertion to electronic parts assembly. Robotic stations designed for this problem often use conventional hybrid force-position control to perform preprogrammed trajectories, such as e.g. a spiral path. However, electronic parts require more sophisticated techniques due to their complex geometry and susceptibility to damage. Production line assembly tasks require high robustness to initial position and rotation variations due to component grip imperfections. Robustness to partially obscured camera view is also mandatory due to multi stage assembly process. We propose a stereo-view method based on reinforcement learning (RL) for the robust assembly of electronic parts. Applicability of our method to real-world production lines is verified through test scenarios. Our approach is the most robust to applied perturbations of all tested methods and can potentially be transferred to environments unseen during learning.","PeriodicalId":404612,"journal":{"name":"Journal of Intelligent and Robotic Systems","volume":"68 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-10-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Intelligent and Robotic Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s10846-023-01970-8","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

In modern manufacturing, assembly tasks are a major challenge for robotics. The manufacturing industry presents a wide range of insertion tasks, from peg-in-hole insertion to electronic parts assembly. Robotic stations designed for this problem often use conventional hybrid force-position control to execute preprogrammed trajectories, such as a spiral path. However, electronic parts require more sophisticated techniques due to their complex geometry and susceptibility to damage. Production-line assembly tasks demand high robustness to variations in initial position and rotation caused by imperfect component grasping. Robustness to a partially obscured camera view is also required because of the multi-stage assembly process. We propose a stereo-view method based on reinforcement learning (RL) for the robust assembly of electronic parts. The applicability of our method to real-world production lines is verified through test scenarios. Our approach is the most robust to applied perturbations of all tested methods and can potentially transfer to environments unseen during learning.
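The abstract contrasts the proposed method with conventional stations that execute a preprogrammed search, such as a spiral path, under hybrid force-position control. The snippet below is a minimal sketch of that baseline idea only; the function name `spiral_search_waypoints`, the units, and all parameter values are illustrative assumptions, not code from the paper.

```python
# Sketch of the conventional baseline mentioned in the abstract: an
# Archimedean spiral search in the hole plane, while a hybrid
# force-position controller would press along the insertion axis.
import numpy as np

def spiral_search_waypoints(center_xy, pitch_mm=0.5, step_deg=10.0, turns=5):
    """Generate XY waypoints spiraling outward from the estimated hole center."""
    theta = np.deg2rad(np.arange(0.0, 360.0 * turns, step_deg))
    radius = pitch_mm * theta / (2.0 * np.pi)  # radius grows by `pitch_mm` per turn
    xy = np.stack([radius * np.cos(theta), radius * np.sin(theta)], axis=1)
    return xy + np.asarray(center_xy)

# Example: first waypoints around a hole estimate at (0, 0) mm.
for x, y in spiral_search_waypoints((0.0, 0.0))[:3]:
    print(f"move to x={x:.3f} mm, y={y:.3f} mm while holding contact force")
```

Such a fixed trajectory cannot adapt to the grasp and pose variations the abstract describes, which motivates the learned stereo-view policy.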
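The paper's implementation is not published with this abstract, so what follows is only a hedged sketch of what a stereo-view RL policy could look like: a shared CNN encoder applied to both camera views, fused with proprioceptive robot state, written in PyTorch. The class name `StereoViewPolicy`, the input shapes, and the layer sizes are all assumptions for illustration, not the authors' architecture.

```python
# Minimal sketch (PyTorch) of a stereo-view insertion policy.
# Architecture, shapes, and names are illustrative assumptions.
import torch
import torch.nn as nn

class StereoViewPolicy(nn.Module):
    def __init__(self, action_dim: int = 6):
        super().__init__()
        # One CNN encoder shared by both camera views.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, stride=1), nn.ReLU(),
            nn.Flatten(),
        )
        # MLP head fusing both view embeddings with the robot state
        # (e.g. end-effector pose and force/torque readings).
        self.head = nn.Sequential(
            nn.LazyLinear(256), nn.ReLU(),
            nn.Linear(256, action_dim), nn.Tanh(),  # bounded corrective action
        )

    def forward(self, left: torch.Tensor, right: torch.Tensor,
                robot_state: torch.Tensor) -> torch.Tensor:
        z = torch.cat([self.encoder(left), self.encoder(right), robot_state], dim=-1)
        return self.head(z)

# Example: a batch of two 84x84 RGB views plus a 12-D proprioceptive state.
policy = StereoViewPolicy()
action = policy(torch.rand(1, 3, 84, 84), torch.rand(1, 3, 84, 84), torch.rand(1, 12))
print(action.shape)  # torch.Size([1, 6])
```

Sharing encoder weights across the two views is one plausible route to the robustness the abstract claims: if one view is partially obscured during a multi-stage assembly step, the other view still yields a usable embedding for the policy.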