Title: Visual servoing of an articulated object based on stereovision
Authors: Fadi Alkhalil, L. Barbé, C. Doignon
DOI: 10.1109/ROSE.2011.6058535 (https://doi.org/10.1109/ROSE.2011.6058535)
Published: 2011-10-24, in 2011 IEEE International Symposium on Robotic and Sensors Environments (ROSE)
Citations: 2
Abstract
For about a decade and a half, the tracking of nonrigid mechanical structures has been under investigation. For tasks with collaborative robots, or obstacle avoidance with redundant arms, motion tracking of the structure of interest is essentially 3-D tracking with moving visual sensors, and visual servoing techniques are therefore needed to tackle this kind of complex task. The objective of this work is to use stereo visual feedback to control both the motion of on-board cameras mounted on a robot and the active joints of the viewed articulated target. From a kinematic analysis, the velocity screw is composed of a translational velocity and a rotational velocity, and we present decoupled control laws that improve the convergence of the servoing while also accounting for the types of link to be servoed in the articulated object's joint space. To assess this work, lines are used as visual cues, since they are easy to segment from most images of man-made environments and serve as the axes of many model-based human, animal, or robot motions. Task functions based on Plücker coordinates are presented for this combined joint-based and position-based visual servoing approach. Simulations and preliminary results from experiments with an eye-in-hand robotic platform and a motorized revolute joint are provided.
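The abstract rests on two standard constructions: Plücker coordinates of a 3-D line (a unit direction plus a moment vector) as the visual feature, and a velocity-screw control law driving the feature error to zero. The sketch below illustrates both under stated assumptions; the function names, the gain value, and the generic pseudo-inverse control law `v = -λ L⁺ e` (the classic form used in visual servoing) are illustrative, not the paper's specific decoupled laws.

```python
import numpy as np

def plucker_line(p, q):
    """Plücker coordinates (u, m) of the 3-D line through points p and q.

    u is the unit direction of the line; m = p x u is the moment vector,
    which is orthogonal to u and independent of the point chosen on the line.
    """
    u = (q - p) / np.linalg.norm(q - p)
    m = np.cross(p, u)
    return u, m

def servo_velocity(L, e, gain=0.5):
    """Generic visual-servoing update: velocity screw v = -gain * pinv(L) @ e.

    L is the interaction matrix relating feature rates to the camera's
    velocity screw (translational + rotational velocity); e is the feature
    error. This is the textbook law, not the paper's decoupled variant.
    """
    return -gain * np.linalg.pinv(L) @ e
```

A useful sanity check on the representation is that the moment `m` does not change when the line is sampled at different points, which is what makes Plücker coordinates a well-defined line feature for tracking.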