S. Negahdaripour, L. Jin, Xun Xu, C. Tsukamoto, J. Yuh
{"title":"A real-time vision-based 3D motion estimation system for positioning and trajectory following","authors":"S. Negahdaripour, L. Jin, Xun Xu, C. Tsukamoto, J. Yuh","doi":"10.1109/ACV.1996.572067","DOIUrl":null,"url":null,"abstract":"The authors present a real-time vision-based system for automatic positioning and trajectory following, based on a direct method for 3D motion estimation. The spatio-temporal derivatives of the image function, calculated from time-varying imagery, are used to directly calculate the motion and position of the camera. For demonstration, they have implemented the system on a one-degree-of freedom thruster operating in a laboratory water tank. The estimated position information is communicated to the control system, a PID controller, in order to generate the appropriate signal to correct the thruster system's position. The performance of the vision system is demonstrated in selected experiments by comparing results with the data from an optical encoder position sensor.","PeriodicalId":222106,"journal":{"name":"Proceedings Third IEEE Workshop on Applications of Computer Vision. WACV'96","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1996-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings Third IEEE Workshop on Applications of Computer Vision. WACV'96","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ACV.1996.572067","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9
Abstract
The authors present a real-time vision-based system for automatic positioning and trajectory following, based on a direct method for 3D motion estimation. The spatio-temporal derivatives of the image function, computed from time-varying imagery, are used to directly estimate the motion and position of the camera. For demonstration, they have implemented the system on a one-degree-of-freedom thruster operating in a laboratory water tank. The estimated position is communicated to the control system, a PID controller, which generates the appropriate signal to correct the thruster system's position. The performance of the vision system is demonstrated in selected experiments by comparing its results with data from an optical encoder position sensor.
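The abstract does not give the estimator's equations, but the general idea of a "direct" method is to use the brightness constancy constraint on image derivatives rather than explicit feature tracking. The following is a minimal, illustrative Python sketch under simplifying assumptions (a single horizontal degree of freedom, least-squares solution of the brightness constancy equation, placeholder PID gains); it is not the authors' formulation, and all function names and parameters are hypothetical.

```python
import numpy as np

def estimate_translation(prev_frame, curr_frame, dt=1.0):
    """Least-squares estimate of horizontal image motion (pixels per unit time)
    from spatio-temporal derivatives, assuming brightness constancy
    Ix*u + It = 0 and a dominant one-DOF translation.
    Illustrative only; the paper's estimator recovers full 3D motion."""
    prev = prev_frame.astype(float)
    curr = curr_frame.astype(float)
    Ix = np.gradient(curr, axis=1)   # spatial derivative along x
    It = (curr - prev) / dt          # temporal derivative
    denom = np.sum(Ix * Ix)
    if denom < 1e-9:
        return 0.0                   # textureless image: no reliable estimate
    return -np.sum(Ix * It) / denom

class PID:
    """Textbook PID controller; gains below are placeholders, not the paper's."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical control loop: integrate the vision-based velocity estimate into a
# position, compare against a target position, and drive the thruster command.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prev = rng.random((64, 64))
    curr = np.roll(prev, 1, axis=1)          # simulate a 1-pixel shift
    velocity = estimate_translation(prev, curr, dt=0.04)
    controller = PID(kp=0.5, ki=0.05, kd=0.1)
    position, target = 0.0, 10.0
    position += velocity * 0.04
    command = controller.update(target - position, dt=0.04)
    print(f"estimated velocity={velocity:.2f}, thruster command={command:.2f}")
```

In the paper's setup, the role of `estimate_translation` is played by the direct 3D motion estimator, and the resulting position estimate replaces the optical encoder reading in closing the PID loop.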