{"title":"Intensity-difference based 3D video stabilization for planetary robots","authors":"G. Martinez","doi":"10.1109/ICEEE.2016.7751226","DOIUrl":null,"url":null,"abstract":"In this paper, an algorithm to remove the camera jitter from image sequences captured by planetary robots is investigated. First, the frame to frame surface 3D motion with respect to the camera coordinate system is estimated and integrated over time. The estimation is performed by maximizing a likelihood function of the frame to frame intensity differences measured at key observation points. Then, the jitter is determined as the perspective projection of the difference between the integrated surface 3D translation and a smoothed version of it. Finally, the stabilized video is synthesized by moving the entire content of each image with a displacement vector having the same magnitude but opposite direction to the estimated jitter for that image. The experimental results with synthetic data revealed real time operation with low latency and a reduction of the jitter in a factor of 20. Experimental results with real image sequences captured by a rover platform in indoor and outdoor conditions show very reliable and encouraging stabilization results.","PeriodicalId":285464,"journal":{"name":"2016 13th International Conference on Electrical Engineering, Computing Science and Automatic Control (CCE)","volume":"111 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 13th International Conference on Electrical Engineering, Computing Science and Automatic Control (CCE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICEEE.2016.7751226","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
In this paper, an algorithm to remove camera jitter from image sequences captured by planetary robots is investigated. First, the frame-to-frame 3D motion of the surface with respect to the camera coordinate system is estimated and integrated over time. The estimation is performed by maximizing a likelihood function of the frame-to-frame intensity differences measured at key observation points. Then, the jitter is determined as the perspective projection of the difference between the integrated 3D surface translation and a smoothed version of it. Finally, the stabilized video is synthesized by shifting the entire content of each image by a displacement vector with the same magnitude as, but opposite direction to, the estimated jitter for that image. Experimental results with synthetic data show real-time operation with low latency and a reduction of the jitter by a factor of 20. Experimental results with real image sequences captured by a rover platform in indoor and outdoor conditions show very reliable and encouraging stabilization.
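To make the compensation stage concrete, the fragment below is a minimal Python/OpenCV sketch of the last two steps described in the abstract: low-pass filtering the integrated 3D translation, projecting the residual (the jitter) into the image plane with a simple pinhole model, and shifting each frame by the opposite displacement. The likelihood-based motion estimation itself is not reproduced here, and the focal length, nominal scene depth, and smoothing window are illustrative assumptions rather than values from the paper.

```python
# Hedged sketch of the compensation stage only; the integrated 3D
# translation T is assumed to have been estimated already. focal_px,
# scene_depth_m and window are placeholder assumptions.
import numpy as np
import cv2


def smooth_translation(T, window=15):
    """Moving-average low-pass filter of the integrated 3D translation.
    T: (N, 3) array of accumulated camera-frame translations."""
    kernel = np.ones(window) / window
    # Filter each axis independently, keeping the original length.
    return np.column_stack(
        [np.convolve(T[:, i], kernel, mode="same") for i in range(3)]
    )


def project_jitter(dT, focal_px, scene_depth_m):
    """Pinhole projection of the residual 3D translation onto the image
    plane, using a single nominal scene depth (an assumption here)."""
    dx = focal_px * dT[:, 0] / scene_depth_m  # horizontal jitter [px]
    dy = focal_px * dT[:, 1] / scene_depth_m  # vertical jitter [px]
    return np.column_stack([dx, dy])


def stabilize(frames, T, focal_px=700.0, scene_depth_m=5.0, window=15):
    """Shift every frame by a vector equal in magnitude and opposite in
    direction to its estimated 2D jitter."""
    jitter = project_jitter(T - smooth_translation(T, window),
                            focal_px, scene_depth_m)
    stabilized = []
    for frame, (jx, jy) in zip(frames, jitter):
        h, w = frame.shape[:2]
        M = np.float32([[1, 0, -jx], [0, 1, -jy]])  # opposite displacement
        stabilized.append(cv2.warpAffine(frame, M, (w, h)))
    return stabilized
```

In this sketch the smoothed trajectory plays the role of the intended camera path, so only the high-frequency residual is compensated; a real implementation would use the per-frame depth and camera intrinsics from the estimation stage instead of a single assumed depth.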