Friction from Reflectance: Transfer Learning Approach

Piotr Kicki, K. Walas
DOI: 10.1109/ICRAE48301.2019.9043793
Published in: 2019 4th International Conference on Robotics and Automation Engineering (ICRAE), November 2019
Citation count: 0

Abstract

Gathering knowledge about the world surrounding the robot is a crucial step towards the robot's autonomy. Part of that knowledge is the set of physical parameters of objects, such as stiffness, damping, or friction coefficients, which are critical for performing interaction. As in the human perception system, vision is the sense that provides robots with the most data, so it is natural to ask whether the parameters mentioned above can be estimated from images. In this paper, we propose a new approach to estimating the friction coefficient from vision, i.e. from reflectance images. The solution is based on transfer learning, understood here as the use of pre-trained networks to solve the friction estimation task. Our results surpass the state-of-the-art approach on a publicly available dataset. The paper first provides a short overview of the state of the art, followed by a description of the dataset. Then, we describe our method and present the obtained results. Finally, a discussion of the results and conclusions are given.
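The abstract does not specify the network architecture, so the following is only a schematic sketch of the transfer-learning idea it describes: a pre-trained feature extractor is kept frozen, and only a new regression head is fitted on its features to predict a friction coefficient. Here a fixed random projection stands in for the pre-trained network, and the data are synthetic; all names and shapes are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen, pre-trained feature extractor. In the paper's
# setting this would be a CNN pre-trained on a large image dataset;
# here a fixed random projection plays that role for illustration.
W_pretrained = rng.normal(size=(64, 16))  # frozen weights, never updated

def extract_features(images):
    """Map flattened reflectance patches to feature vectors (frozen)."""
    return np.maximum(images @ W_pretrained, 0.0)  # ReLU activation

# Synthetic regression task: a scalar "friction coefficient" per patch.
X = rng.normal(size=(200, 64))             # 200 flattened reflectance patches
true_w = rng.normal(size=16)               # hidden linear relation in feature space
y = extract_features(X) @ true_w + 0.01 * rng.normal(size=200)

# Transfer-learning step: keep the extractor frozen and fit only a new
# linear regression head on top of its features (ordinary least squares).
F = extract_features(X)
head, *_ = np.linalg.lstsq(F, y, rcond=None)

pred = F @ head
mse = float(np.mean((pred - y) ** 2))
print(f"training MSE of the regression head: {mse:.4f}")
```

The design point the sketch illustrates is that only the small head is trained, so very little task-specific data is needed; the representational work is inherited from the pre-trained extractor.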