Yazhe Tang, Shaorong Xie, F. Lin, Jianyu Yang, Youfu Li
Title: Equivalent projection based distortion invariant visual tracking for omnidirectional vision
Published in: 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO), December 2015
DOI: 10.1109/ROBIO.2015.7418831
Citations: 0
Abstract
Catadioptric omnidirectional images suffer from severe distortion due to the quadric mirrors involved. Consequently, most visual features developed on the basis of the perspective projection model struggle to achieve satisfactory performance when applied directly to omnidirectional images. To accurately calculate the deformed neighborhood of a target, this paper employs an equivalent projection approach to effectively formulate the distortion of the omnidirectional camera. On the basis of equivalent projection, the paper presents a distortion-invariant multi-feature fusion method for robust feature representation in omnidirectional images. Using a Gaussian Mixture Model (GMM), multiple features are integrated into a single probabilistic framework; in other words, the GMM transforms the feature-matching problem into multi-channel clustering. A fragment-based tracking framework robustly handles partial occlusion through an adaptive weight-metric mechanism. Finally, a series of experiments validates the performance of the proposed algorithm.
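The abstract's idea of fusing multiple feature channels with a GMM, so that feature matching becomes soft clustering of stacked channels, can be illustrated with a minimal sketch. This is not the authors' implementation: the spherical-covariance EM, the way channels are concatenated into rows of `X`, and the cluster count `k` are all assumptions made purely for illustration.

```python
import numpy as np

def gmm_em(X, k, iters=50, seed=0):
    """Fit a spherical Gaussian Mixture Model with EM and return
    soft cluster assignments (responsibilities).

    X : (n, d) array, one row per pixel/patch; columns are the
        concatenated feature channels (assumed layout).
    k : number of mixture components (assumed hyperparameter).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, size=k, replace=False)]   # init means from data
    var = np.full(k, X.var() + 1e-6)               # spherical variances
    pi = np.full(k, 1.0 / k)                       # mixing weights
    for _ in range(iters):
        # E-step: r[i, j] ∝ pi_j * N(x_i | mu_j, var_j * I)
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)  # (n, k)
        logp = np.log(pi) - 0.5 * d * np.log(2 * np.pi * var) - d2 / (2 * var)
        logp -= logp.max(axis=1, keepdims=True)    # stabilize exp
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0) + 1e-12
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        var = (r * d2).sum(axis=0) / (d * nk) + 1e-6
    return r
```

The responsibilities `r` play the role of soft matches: instead of hard nearest-neighbor feature matching, each stacked feature vector receives a probability of belonging to each mixture component, which is the "multi-channel clustering" view the abstract describes.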