{"title":"Object Tracking Under Occlusion Using LGEM-Trained SSVM","authors":"Anqi Lin, Tingyang Wei, Wing W. Y. Ng","doi":"10.1109/ICWAPR.2018.8521283","DOIUrl":null,"url":null,"abstract":"Adaptive tracking-by-detection methods are widely used in computer vision for object tracking. Struck tracking is known as avoiding unclear intermediate labeling steps, and utilizing both the SMO and the budgeting mechanism for updating. However, the fixed budget is inflexible and heuristic, and the optimization-loop easily leads SVMs to overfitting, and the fixed combination of three kernels with specified features weakens extension capabilities. Furthermore, the quick update causes wrong learning under occlusion, considering previous “negative” samples as the current “positive” samples and drifting the tracker to that “neg-ative” samples. In this paper, we present a framework based on both the one-kernel Struck and the Localized Generalization Error Model (LGEM). By comparing the $Q$ values of Structured output SVM (SSVM) with different structures and con-troling the updating loops, a tradeoff the Optimality and Generalization is realized. Moreover, via measuring the fluctuation on $Q$ value, suitable new samples are selected for updating, tracking is simplified into using a single Gaussian kernel for further potential extension. As a result, a more generalized, occlusion-overcoming tracker is constructed. Experimentally, our algorithm is shown to be able to outperform state-of-the-art trackers on occlusion handle on various benchmark videos.","PeriodicalId":385478,"journal":{"name":"2018 International Conference on Wavelet Analysis and Pattern Recognition (ICWAPR)","volume":"44 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 International Conference on Wavelet Analysis and Pattern Recognition (ICWAPR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICWAPR.2018.8521283","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Adaptive tracking-by-detection methods are widely used in computer vision for object tracking. The Struck tracker is known for avoiding unclear intermediate labeling steps and for using both SMO and a budgeting mechanism to update its model. However, the fixed budget is inflexible and heuristic, the optimization loop easily leads the SVM to overfitting, and the fixed combination of three kernels with hand-specified features weakens extensibility. Furthermore, fast updating causes wrong learning under occlusion: previous “negative” samples are treated as current “positive” samples, drifting the tracker toward those “negative” samples. In this paper, we present a framework based on both a one-kernel Struck tracker and the Localized Generalization Error Model (LGEM). By comparing the $Q$ values of structured output SVMs (SSVMs) with different structures and by controlling the update loops, a tradeoff between optimality and generalization is realized. Moreover, by measuring the fluctuation of the $Q$ value, suitable new samples are selected for updating, and tracking is simplified to a single Gaussian kernel for further potential extension. As a result, a more generalized, occlusion-robust tracker is constructed. Experimentally, our algorithm is shown to outperform state-of-the-art trackers on occlusion handling on various benchmark videos.
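To make the update-gating idea concrete, below is a minimal Python sketch of how a single-Gaussian-kernel SSVM score could be probed for fluctuation before accepting a new sample. It is an illustration under stated assumptions, not the paper's implementation: the function names (`gaussian_kernel`, `q_fluctuation`, `should_update`), the perturbation-based sensitivity estimate, and all thresholds are hypothetical, and the actual LGEM $Q$-value computation in the paper differs.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=0.2):
    """Single Gaussian (RBF) kernel, the only kernel kept in a one-kernel Struck variant."""
    return np.exp(-np.linalg.norm(x - z) ** 2 / (2.0 * sigma ** 2))

def ssvm_score(x, support_vectors, betas, sigma=0.2):
    """Discriminant score of a kernelized SSVM: sum_i beta_i * k(x_i, x)."""
    return sum(b * gaussian_kernel(sv, x, sigma) for sv, b in zip(support_vectors, betas))

def q_fluctuation(x, support_vectors, betas, q_width=0.05, n_perturb=20, rng=None):
    """Hypothetical stand-in for an LGEM-style sensitivity term: sample perturbations
    within a Q-neighbourhood of x and measure how much the SSVM score fluctuates."""
    rng = np.random.default_rng() if rng is None else rng
    base = ssvm_score(x, support_vectors, betas)
    deltas = rng.uniform(-q_width, q_width, size=(n_perturb, x.shape[0]))
    scores = np.array([ssvm_score(x + d, support_vectors, betas) for d in deltas])
    return float(np.mean((scores - base) ** 2))

def should_update(x, support_vectors, betas, fluctuation_threshold=0.1):
    """Gate the tracker update: a large fluctuation suggests an occluded or unreliable
    sample, so the SSVM is left unchanged instead of drifting toward wrong labels."""
    return q_fluctuation(x, support_vectors, betas) < fluctuation_threshold
```

The design intent mirrored here is that samples whose local score is unstable (e.g. because the target is occluded) are withheld from the update loop, which is one way the abstract's "fluctuation of the $Q$ value" criterion could suppress drift.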