Real-time estimation of Human-Cloth topological relationship using depth sensor for robotic clothing assistance

Nishanth Koganti, Tomoya Tamei, Takamitsu Matsubara, T. Shibata
DOI: 10.1109/ROMAN.2014.6926241
Published in: The 23rd IEEE International Symposium on Robot and Human Interactive Communication, 20 October 2014
Citations: 12

Abstract

In this study, we propose a novel method for the real-time estimation of the Human-Cloth relationship, which is crucial for efficient motor-skill learning in Robotic Clothing Assistance. The system relies on a low-cost depth sensor, which provides color and depth images without requiring an elaborate setup, making it suitable for real-world applications. We present an efficient algorithm to estimate the parameters that represent the topological relationship between the human and the clothing article. At the core of our approach are a low-dimensional representation of the Human-Cloth relationship using topology coordinates, for fast learning of motor skills, and a unified ellipse-fitting algorithm for a compact representation of the state of the clothing article. We conducted experiments that illustrate the robustness of these feature representations. Furthermore, we evaluated the performance of the proposed method by applying it to real-time clothing-assistance tasks and comparing its estimates with the ground truth.
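The abstract does not detail the "unified ellipse fitting algorithm" used to compactly represent the clothing state (e.g., the opening of a garment tracked in the depth image). As a rough illustration only — not necessarily the paper's method — the sketch below fits an ellipse to 2-D points via a standard algebraic least-squares conic fit; the function name `fit_ellipse` and the synthetic test points are assumptions for this example.

```python
import numpy as np

def fit_ellipse(points):
    """Algebraic least-squares conic fit a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0.

    Returns the six conic coefficients (up to scale) as the right singular
    vector of the design matrix with the smallest singular value.
    """
    x, y = points[:, 0], points[:, 1]
    # Design matrix: one row per point, one column per conic monomial.
    D = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
    _, _, Vt = np.linalg.svd(D)
    return Vt[-1]

# Synthetic check: points on an axis-aligned ellipse centered at (1, 2)
# with semi-axes 3 and 1.5.
t = np.linspace(0, 2 * np.pi, 50)
pts = np.column_stack([1 + 3 * np.cos(t), 2 + 1.5 * np.sin(t)])
coeffs = fit_ellipse(pts)
# For an axis-aligned ellipse (b ~ 0), the center is recovered as
# (-d / (2a), -e / (2c)), which should come out near (1, 2) here.
```

In a real pipeline such a fit would run on garment-opening points segmented from each depth frame; the ellipse parameters (center, axes, orientation) then serve as the compact cloth-state features the abstract describes.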