Robust Mobile Robot Visual Tracking Control System Using Self-Tuning Kalman Filter

Chi-Yi Tsai, K. Song, X. Dutoit, H. Brussel, M. Nuttin

2007 International Symposium on Computational Intelligence in Robotics and Automation, 20 June 2007. DOI: 10.1109/CIRA.2007.382860
This paper presents a novel design of a robust visual tracking control system consisting of a visual tracking controller and a visual state estimator. The system facilitates human-robot interaction for a unicycle-modeled mobile robot equipped with a tilt camera. Based on a novel dual-Jacobian visual interaction model, a dynamic moving target can be tracked by a single visual tracking controller without knowledge of the target's 3D velocity. The visual state estimator estimates the optimal system state and the target's image velocity, which are then used by the visual tracking controller. To achieve this, a self-tuning Kalman filter is proposed to estimate the parameters of interest online and in real time. Furthermore, because the proposed method works entirely in image space, both the computational complexity and the sensor/camera modeling errors are reduced. Experimental results validate the effectiveness of the proposed method in terms of tracking performance, system convergence, and robustness.
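The abstract's central idea, a Kalman filter that re-tunes its own noise statistics while estimating the target's image-plane position and velocity, can be illustrated with a short sketch. The state vector, constant-velocity motion model, and innovation-based adaptation rule below are assumptions chosen for illustration only; they are not the paper's actual formulation, which rests on the dual-Jacobian visual interaction model.

# Illustrative sketch only (assumed formulation, not the authors' method):
# a constant-velocity Kalman filter in image space whose measurement-noise
# covariance R is re-estimated online from a sliding window of innovations.
import numpy as np

class SelfTuningKF:
    def __init__(self, dt=1.0 / 30.0, window=30):
        # State: [u, v, u_dot, v_dot] -- target pixel position and image velocity.
        self.x = np.zeros(4)
        self.P = np.eye(4) * 1e2
        # Constant-velocity transition; the camera measures pixel position only.
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.zeros((2, 4))
        self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = np.eye(4) * 1e-2   # process noise (assumed constant)
        self.R = np.eye(2) * 1.0    # measurement noise, adapted online
        self.innovations = []       # sliding window of recent innovations
        self.window = window

    def step(self, z):
        """One predict/update cycle given a pixel measurement z = [u, v]."""
        # Predict.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

        # Innovation and its predicted covariance.
        y = z - self.H @ self.x
        HPH = self.H @ self.P @ self.H.T
        S = HPH + self.R

        # Update.
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

        # Self-tuning step: estimate the innovation covariance over the window
        # and back out R (an innovation-based adaptation rule, assumed here).
        self.innovations.append(y)
        if len(self.innovations) > self.window:
            self.innovations.pop(0)
        if len(self.innovations) == self.window:
            C = np.cov(np.array(self.innovations).T)
            R_new = C - HPH
            if np.all(np.linalg.eigvalsh(R_new) > 1e-9):  # keep R positive definite
                self.R = R_new
        return self.x  # estimated pixel position and image velocity

# Usage: feed noisy pixel detections; the returned image velocity is the kind of
# quantity a visual tracking controller could consume in place of 3D target velocity.
kf = SelfTuningKF()
estimate = kf.step(np.array([320.0, 240.0]))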