{"title":"Fast visual object tracking via distortion-suppressed correlation filtering","authors":"Tianyang Xu, Xiaojun Wu","doi":"10.1109/ISC2.2016.7580837","DOIUrl":null,"url":null,"abstract":"Visual object tracking is a basic research unit in the construction of smart cities, it focuses on establishing a dynamic appearance model to represent the target in complex scenarios. In this paper, a distortion-suppressed correlation filtering based tracking method (DSCFT) is proposed. Our approach tackles distortions caused by spatial similarity comparison and temporal appearance updating. We establish our method under a Bayesian framework, where spatial and temporal appearance are embedded in likelihood and prior respectively. Firstly, The spatial distortion is handled by modifying weight windows and utilizing a proposal selection strategy to better track targets under fast motion and background clutters. Secondly, temporal information is retained in updating stage as a prior to represent dynamic variations of the target. Moreover, a multi-scale filtering scheme is integrated when updating the temporal appearance to boost the scale sensitivity. Experimental results dedicate the effectiveness and robustness of our DSCFT on benchmark videos.","PeriodicalId":171503,"journal":{"name":"2016 IEEE International Smart Cities Conference (ISC2)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 IEEE International Smart Cities Conference (ISC2)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISC2.2016.7580837","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Cited by: 4
Abstract
Visual object tracking is a basic research unit in the construction of smart cities; it focuses on establishing a dynamic appearance model to represent the target in complex scenarios. In this paper, a distortion-suppressed correlation filtering based tracking method (DSCFT) is proposed. Our approach tackles distortions caused by spatial similarity comparison and temporal appearance updating. We establish our method under a Bayesian framework, where the spatial and temporal appearance are embedded in the likelihood and the prior, respectively. First, spatial distortion is handled by modifying the weight windows and utilizing a proposal selection strategy to better track targets under fast motion and background clutter. Second, temporal information is retained in the updating stage as a prior that represents dynamic variations of the target. Moreover, a multi-scale filtering scheme is integrated when updating the temporal appearance to boost scale sensitivity. Experimental results demonstrate the effectiveness and robustness of our DSCFT on benchmark videos.
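To make the building blocks named in the abstract concrete, the following is a minimal, generic correlation-filter tracking sketch in NumPy. It is not the authors' DSCFT implementation; it only illustrates the standard components the abstract refers to: a cosine weight window on the search patch, Fourier-domain filter learning, multi-scale detection, and a linear temporal update that keeps past appearance as a prior. The function names, the learning rate `lr`, the regularizer `lam`, and the scale set are illustrative assumptions, not values from the paper.

```python
import numpy as np

def hann_window(shape):
    """2-D cosine (Hann) window that down-weights patch borders before correlation."""
    wy = np.hanning(shape[0])[:, None]
    wx = np.hanning(shape[1])[None, :]
    return wy * wx

def train(patch, gauss_label, lam=1e-2):
    """Learn a ridge-regression correlation filter in the Fourier domain (MOSSE-style).

    Returns the numerator and denominator of the per-frequency closed-form solution,
    kept separate so they can be updated over time.
    """
    F = np.fft.fft2(patch)
    G = np.fft.fft2(gauss_label)          # desired Gaussian-shaped response
    num = G * np.conj(F)
    den = F * np.conj(F) + lam
    return num, den

def detect(num, den, patch):
    """Correlation response of the learned filter on a new (windowed) patch."""
    F = np.fft.fft2(patch)
    return np.real(np.fft.ifft2((num / den) * F))

def detect_multiscale(num, den, patches, scales=(0.95, 1.0, 1.05)):
    """Evaluate the filter over several resampled candidate patches and keep the best scale.

    `patches` are assumed to be pre-resized to the filter size, one per scale.
    """
    best = None
    for s, patch in zip(scales, patches):
        resp = detect(num, den, patch * hann_window(patch.shape))
        peak = resp.max()
        if best is None or peak > best[0]:
            best = (peak, s, np.unravel_index(resp.argmax(), resp.shape))
    return best  # (score, scale, (dy, dx) of the response peak)

def update(num, den, num_new, den_new, lr=0.02):
    """Linear interpolation of the filter: past appearance acts as a temporal prior."""
    return (1 - lr) * num + lr * num_new, (1 - lr) * den + lr * den_new
```

In a tracking loop, one would train on the first frame, run `detect_multiscale` on each new frame to relocate the target and its scale, re-train on the newly found patch, and blend the two filters with `update`; DSCFT's specific distortion-suppression terms (modified weight windows, proposal selection, and the Bayesian prior) are not reproduced here.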