Combined Visual and Touch-based Sensing for the Autonomous Registration of Objects with Circular Features

Arne Sachtler, Korbinian Nottensteiner, M. Kassecker, A. Albu-Schäffer

2019 19th International Conference on Advanced Robotics (ICAR), pp. 426-433, December 2019. DOI: 10.1109/ICAR46387.2019.8981602
Future manufacturing systems will have to allow frequent conversion of production processes. To prevent a surge in setup times and the associated cost, we suggest refraining from specialized part feeders, fixture units, and manual calibration routines. To this end, we regard autonomous object registration as a cornerstone for lowering the manual calibration effort. In this work, we propose a framework for autonomous object registration in robotic workcells using a combination of visual and touch-based sensing. Vision systems in process automation often require well-defined conditions to produce reliable pose estimates. We therefore combine data from a vision system with touch-based sensing in order to benefit from both sensing modalities and to reduce the requirements on the vision system. We use a particle filter to estimate the pose of individual features and show how to retrieve the overall pose of an object in the workcell from these feature estimates. The observed feature distribution is used to autonomously trigger dedicated touch-based probing actions with a lightweight robotic arm in order to increase accuracy. In particular, we describe the detection of circular features and validate the framework in experiments with our robotic assembly system.
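The abstract does not come with code, but the core idea it describes can be sketched: a particle filter maintains hypotheses of a feature's pose, a coarse vision measurement initializes and weights the particles, and precise touch probes on the feature refine the estimate. The minimal Python sketch below illustrates this for the planar center of a circular feature of known radius; all function names, noise levels, and measurement models are assumptions made for illustration and are not the authors' implementation.

```python
"""Minimal particle-filter sketch (illustration only, not the authors' code):
estimating the planar center of a circular feature of known radius by fusing
a coarse camera estimate with a few precise touch probes on the rim."""

import numpy as np

rng = np.random.default_rng(0)

N = 2000          # number of particles
RADIUS = 0.02     # known feature radius [m] (assumed)

# --- initialization from a coarse vision estimate (assumed ~5 mm std) ---
vision_center = np.array([0.300, 0.150])               # hypothetical camera output [m]
particles = vision_center + rng.normal(scale=0.005, size=(N, 2))
weights = np.full(N, 1.0 / N)

def update_vision(particles, weights, z, sigma=0.005):
    """Weight particles by a Gaussian vision measurement of the center."""
    d2 = np.sum((particles - z) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / sigma**2)
    return weights / np.sum(weights)

def update_touch(particles, weights, contact, sigma=0.0005):
    """Weight particles by a touch probe: the contact point should lie on the
    rim, i.e. at distance RADIUS from the true center (assumed 0.5 mm std)."""
    dist = np.linalg.norm(particles - contact, axis=1)
    weights = weights * np.exp(-0.5 * ((dist - RADIUS) / sigma) ** 2)
    return weights / np.sum(weights)

def resample(particles, weights):
    """Resample particles according to their weights to avoid degeneracy."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx].copy(), np.full(len(particles), 1.0 / len(particles))

# --- fuse one vision measurement and three simulated touch probes ---
true_center = np.array([0.302, 0.149])                 # ground truth for the demo
weights = update_vision(particles, weights, vision_center)
particles, weights = resample(particles, weights)

for angle in (0.0, 2.1, 4.2):                          # probe directions [rad]
    contact = true_center + RADIUS * np.array([np.cos(angle), np.sin(angle)])
    weights = update_touch(particles, weights, contact)
    particles, weights = resample(particles, weights)

estimate = np.average(particles, axis=0, weights=weights)
spread = np.std(particles, axis=0)                     # rough uncertainty measure
print("estimated center:", estimate, "std:", spread)
```

In the framework described above, the spread of the estimated feature distribution is the kind of quantity that would drive the decision to trigger further touch probes; in this sketch it is only printed as a rough uncertainty measure.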