Reinforcement-Learning Based Robotic Assembly of Fractured Objects Using Visual and Tactile Information

Xinchao Song, Nikolas Lamb, Sean Banerjee, N. Banerjee
2023 9th International Conference on Automation, Robotics and Applications (ICARA)
DOI: 10.1109/ICARA56516.2023.10125938
Published: 2023-02-10

Abstract

Though several approaches exist to automatically generate repair parts for fractured objects, there has been little prior work on the automatic assembly of generated repair parts. Assembly of repair parts to fractured objects is a challenging problem due to the complex high-frequency geometry at the fractured region, which limits the effectiveness of traditional controllers. We present an approach using reinforcement learning that combines visual and tactile information to automatically assemble repair parts to fractured objects. Our approach overcomes the limitations of existing assembly approaches that require objects to have a specific structure, that require training on a large dataset to generalize to new objects, or that require the assembled state to be easily identifiable, such as for peg-in-hole assembly. We propose two visual metrics that provide estimation of assembly state with 3 degrees of freedom. Tactile information allows our approach to assemble objects under occlusion, as occurs when the objects are nearly assembled. Our approach is able to assemble objects with complex interfaces without placing requirements on object structure.
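The abstract's core idea, fusing a 3-DoF visual pose estimate with tactile feedback so the policy can keep assembling once the fracture interface is occluded, can be sketched as follows. This is a minimal illustrative sketch only: the names (`AssemblyObservation`, `build_state`) and the occlusion-based weighting are assumptions, not the paper's actual formulation.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class AssemblyObservation:
    """One step of sensor input for the assembly policy (hypothetical structure)."""
    visual_pose: np.ndarray     # 3-DoF assembly-state estimate: (x, y, theta)
    occlusion: float            # fraction of the fracture interface hidden, in [0, 1]
    tactile_wrench: np.ndarray  # 6-D force/torque reading from the wrist sensor


def build_state(obs: AssemblyObservation) -> np.ndarray:
    """Concatenate visual and tactile channels into a flat RL state vector.

    As occlusion grows (i.e. the objects are nearly assembled), the visual
    pose is down-weighted so the policy relies more on tactile feedback.
    """
    visual_weight = 1.0 - obs.occlusion
    return np.concatenate([
        visual_weight * obs.visual_pose,
        obs.tactile_wrench,
        [obs.occlusion],
    ])


obs = AssemblyObservation(
    visual_pose=np.array([0.02, -0.01, 0.1]),
    occlusion=0.5,
    tactile_wrench=np.array([0.0, 0.0, 1.5, 0.0, 0.0, 0.0]),
)
state = build_state(obs)
print(state.shape)  # (10,)
```

In practice such a state vector would feed a standard RL policy network; the down-weighting is one simple way to encode "trust touch under occlusion," chosen here only to make the abstract's claim concrete.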