Autonomous table-cleaning from kinesthetic demonstrations using Deep Learning

Nino Cauli, Pedro Vicente, Jaeseok Kim, B. Damas, A. Bernardino, F. Cavallo, J. Santos-Victor
{"title":"Autonomous table-cleaning from kinesthetic demonstrations using Deep Learning","authors":"Nino Cauli, Pedro Vicente, Jaeseok Kim, B. Damas, A. Bernardino, F. Cavallo, J. Santos-Victor","doi":"10.1109/DEVLRN.2018.8761013","DOIUrl":null,"url":null,"abstract":"We address the problem of teaching a robot how to autonomously perform table-cleaning tasks in a robust way. In particular, we focus on wiping and sweeping a table with a tool (e.g., a sponge). For the training phase, we use a set of kinestethic demonstrations performed over a table. The recorded 2D table-space trajectories, together with the images acquired by the robot, are used to train a deep convolutional network that automatically learns the parameters of a Gaussian Mixture Model that represents the hand movement. After the learning stage, the network is fed with the current image showing the location/shape of the dirt or stain to clean. The robot is able to perform cleaning arm-movements, obtained through Gaussian Mixture Regression using the mixture parameters provided by the network. Invariance to the robot posture is achieved by applying a plane-projective transformation before inputting the images to the neural network; robustness to illumination changes and other disturbances is increased by considering an augmented data set. This improves the generalization properties of the neural network, enabling for instance its use with the left arm after being trained using trajectories acquired with the right arm. The system was tested on the iCub robot generating a cleaning behaviour similar to the one of human demonstrators.","PeriodicalId":236346,"journal":{"name":"2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob)","volume":"206 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DEVLRN.2018.8761013","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 9

Abstract

We address the problem of teaching a robot how to autonomously perform table-cleaning tasks in a robust way. In particular, we focus on wiping and sweeping a table with a tool (e.g., a sponge). For the training phase, we use a set of kinesthetic demonstrations performed over a table. The recorded 2D table-space trajectories, together with the images acquired by the robot, are used to train a deep convolutional network that automatically learns the parameters of a Gaussian Mixture Model representing the hand movement. After the learning stage, the network is fed with the current image showing the location/shape of the dirt or stain to clean. The robot then performs cleaning arm movements obtained through Gaussian Mixture Regression using the mixture parameters provided by the network. Invariance to the robot posture is achieved by applying a plane-projective transformation before inputting the images to the neural network; robustness to illumination changes and other disturbances is increased by considering an augmented data set. This improves the generalization properties of the neural network, enabling, for instance, its use with the left arm after being trained using trajectories acquired with the right arm. The system was tested on the iCub robot, generating a cleaning behaviour similar to that of the human demonstrators.
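The core of the pipeline described above, a network that outputs Gaussian Mixture Model parameters, followed by Gaussian Mixture Regression over a time/phase variable to produce a 2D table-space trajectory, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the parameter layout (K components over a joint (t, x, y) variable), the `gmr_trajectory` helper, and the example values are all hypothetical and chosen only to show the regression step.

```python
# Minimal Gaussian Mixture Regression (GMR) sketch.  Assumes the network
# outputs, for each of K mixture components, a prior, a mean over (t, x, y)
# and a 3x3 covariance; shapes and names are illustrative, not the paper's.
import numpy as np

def gaussian_pdf(t, mean, var):
    """1-D Gaussian density of the time/phase input."""
    return np.exp(-0.5 * (t - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def gmr_trajectory(priors, means, covs, ts):
    """Regress a 2-D table-space trajectory from GMM parameters.

    priors : (K,)      mixture weights
    means  : (K, 3)    component means over (t, x, y)
    covs   : (K, 3, 3) component covariances over (t, x, y)
    ts     : (T,)      query phase values, e.g. np.linspace(0, 1, T)
    """
    K = priors.shape[0]
    traj = np.zeros((len(ts), 2))
    for i, t in enumerate(ts):
        # responsibility of each component for this phase value
        h = np.array([priors[k] * gaussian_pdf(t, means[k, 0], covs[k, 0, 0])
                      for k in range(K)])
        h /= h.sum()
        # conditional mean of (x, y) given t, blended across components
        for k in range(K):
            cond = means[k, 1:] + covs[k, 1:, 0] / covs[k, 0, 0] * (t - means[k, 0])
            traj[i] += h[k] * cond
    return traj

# Example: two components tracing a rough left-to-right wiping stroke.
priors = np.array([0.5, 0.5])
means = np.array([[0.25, 0.10, 0.20],
                  [0.75, 0.40, 0.25]])
covs = np.tile(np.diag([0.02, 0.01, 0.01]), (2, 1, 1))
xy = gmr_trajectory(priors, means, covs, np.linspace(0.0, 1.0, 50))
```

The plane-projective transformation mentioned in the abstract would, presumably, amount to warping the camera image into a canonical top-down view of the table (a homography) before it is fed to the network, so that the same network applies regardless of head/arm posture; the details of how that homography is obtained are not specified here.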