{"title":"Feature-based Deep Learning of Proprioceptive Models for Robotic Force Estimation","authors":"Erik Berger, Alexander Uhlig","doi":"10.1109/HUMANOIDS47582.2021.9555682","DOIUrl":null,"url":null,"abstract":"Safe and meaningful interaction with robotic systems during behavior execution requires accurate sensing capabilities. This can be achieved by the usage of force-torque sensors which are often heavy, expensive, and require an additional power supply. Consequently, providing accurate sensing capabilities to lightweight robots, with a limited amount of load, is a challenging task. Furthermore, such sensors are not able to distinguish between task-specific regular forces and external influences as induced by human co-workers. To solve this, robots often rely on a large number of manually generated rules which is a time-consuming procedure. This paper presents a data-driven machine learning approach that enhances robotic behavior with estimates of the expected proprioceptive forces (intrinsic) and unexpected forces (extrinsic) exerted by the environment. First, the robot’s common internal sensors are recorded together with ground truth measurements of the actual forces during regular and perturbed behavior executions. The resulting data is used to generate features that contain a compact representation of behavior-specific intrinsic and extrinsic fluctuations. Those features are then utilized for deep learning of proprioceptive models which enables a robot to accurately distinguish the amount of intrinsic and extrinsic forces. Experiments performed with the UR5 robot show a substantial improvement in accuracy over force values provided by previous research.","PeriodicalId":320510,"journal":{"name":"2020 IEEE-RAS 20th International Conference on Humanoid Robots (Humanoids)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-07-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE-RAS 20th International Conference on Humanoid Robots (Humanoids)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HUMANOIDS47582.2021.9555682","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Safe and meaningful interaction with robotic systems during behavior execution requires accurate sensing capabilities. These can be provided by force-torque sensors, which are often heavy, expensive, and require an additional power supply. Consequently, providing accurate sensing capabilities to lightweight robots with a limited payload is a challenging task. Furthermore, such sensors cannot distinguish between task-specific regular forces and external influences such as those induced by human co-workers. To solve this, robots often rely on a large number of manually generated rules, which is a time-consuming procedure. This paper presents a data-driven machine learning approach that enhances robotic behavior with estimates of the expected proprioceptive forces (intrinsic) and the unexpected forces (extrinsic) exerted by the environment. First, the robot’s common internal sensors are recorded together with ground-truth measurements of the actual forces during regular and perturbed behavior executions. The resulting data is used to generate features that provide a compact representation of behavior-specific intrinsic and extrinsic fluctuations. These features are then used for deep learning of proprioceptive models, enabling the robot to accurately distinguish the amounts of intrinsic and extrinsic forces. Experiments performed with the UR5 robot show a substantial improvement in accuracy over the force values reported in previous research.
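To make the described pipeline concrete, the following is a minimal sketch (not the authors' code) of how windowed features from a robot's internal sensors could be mapped to intrinsic and extrinsic force estimates with a small neural network. All names, window sizes, channel counts, and the synthetic data are hypothetical assumptions for illustration only; the paper's actual feature generation and network architecture may differ.

```python
# Illustrative sketch, assuming PyTorch and NumPy are available.
# sensor_log, wrench_log, WINDOW, and N_CHANNELS are hypothetical placeholders.
import numpy as np
import torch
import torch.nn as nn

WINDOW = 10       # samples per feature window (assumption)
N_CHANNELS = 12   # e.g. joint positions, velocities, motor currents (assumption)

def make_features(sensor_log: np.ndarray) -> np.ndarray:
    """Compact per-window features: mean and std of each internal-sensor channel."""
    n = sensor_log.shape[0] // WINDOW
    windows = sensor_log[: n * WINDOW].reshape(n, WINDOW, N_CHANNELS)
    return np.concatenate([windows.mean(axis=1), windows.std(axis=1)], axis=1)

# Placeholder recordings standing in for internal-sensor logs and ground-truth
# force-torque measurements (6 intrinsic + 6 extrinsic wrench components).
sensor_log = np.random.randn(5000, N_CHANNELS).astype(np.float32)
wrench_log = np.random.randn(5000 // WINDOW, 12).astype(np.float32)

X = torch.from_numpy(make_features(sensor_log))
Y = torch.from_numpy(wrench_log)

# Small feedforward model predicting intrinsic and extrinsic wrench components.
model = nn.Sequential(
    nn.Linear(2 * N_CHANNELS, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 12),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), Y)
    loss.backward()
    opt.step()
```

In practice, the feature representation and targets would come from recordings of regular and perturbed behavior executions with a force-torque sensor providing the ground truth, as described in the abstract.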