{"title":"Robust particle PHD filter with sparse representation for multi-target tracking","authors":"Zeyu Fu, P. Feng, S. M. Naqvi, J. Chambers","doi":"10.1109/ICDSP.2016.7868562","DOIUrl":null,"url":null,"abstract":"Recently, sparse representation has been widely used in computer vision and visual tracking applications, including face recognition and object tracking. In this paper, we propose a novel robust multi-target tracking method by applying sparse representation in a particle probability hypothesis density (PHD) filter framework. We employ the dictionary learning method and principle component analysis (PCA) to train a static appearance model offline with sufficient training data. This pre-trained dictionary contains both colour histogram and oriented gradient histogram (HOG) features based on foreground target appearances. The tracker combines the pre-trained dictionary and sparse coding to discriminate the tracked target from background clutter. The sparse coefficients solved by ℓ1-minimization are employed to generate the likelihood function values, which are further applied in the update step of the proposed particle PHD filter. The proposed particle PHD filter is validated on two video sequences from publicly available CAVIAR and PETS2009 datasets, and demonstrates improved tracking performance in comparison with the traditional particle PHD filter.","PeriodicalId":206199,"journal":{"name":"2016 IEEE International Conference on Digital Signal Processing (DSP)","volume":"113 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 IEEE International Conference on Digital Signal Processing (DSP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDSP.2016.7868562","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 5
Abstract
Recently, sparse representation has been widely used in computer vision and visual tracking applications, including face recognition and object tracking. In this paper, we propose a novel robust multi-target tracking method that applies sparse representation within a particle probability hypothesis density (PHD) filter framework. We employ dictionary learning and principal component analysis (PCA) to train a static appearance model offline with sufficient training data. This pre-trained dictionary contains both colour histogram and histogram of oriented gradients (HOG) features based on foreground target appearances. The tracker combines the pre-trained dictionary with sparse coding to discriminate the tracked targets from background clutter. The sparse coefficients obtained by ℓ1-minimization are used to generate the likelihood function values, which are then applied in the update step of the proposed particle PHD filter. The proposed particle PHD filter is validated on two video sequences from the publicly available CAVIAR and PETS2009 datasets, and demonstrates improved tracking performance in comparison with the traditional particle PHD filter.
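To make the likelihood step concrete, the following is a minimal sketch (not the authors' code) of how candidate appearance features can be sparsely coded over a pre-trained dictionary via ℓ1-minimization and the reconstruction residual mapped to a likelihood that reweights the particles in the PHD update. The dictionary, the ISTA solver, the Gaussian residual-to-likelihood mapping, and all parameter values are illustrative assumptions rather than details taken from the paper.

```python
# Sketch: sparse-coding likelihood for reweighting particles in a PHD-style update.
# Everything below (dictionary, solver, sigma/lambda values) is assumed for illustration.
import numpy as np

def sparse_code_ista(D, y, lam=0.1, n_iter=200):
    """Solve min_x 0.5*||y - D x||_2^2 + lam*||x||_1 with ISTA (proximal gradient)."""
    L = np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the smooth term
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-thresholding
    return x

def sparse_likelihood(D, y, sigma=0.1, lam=0.1):
    """Map the reconstruction residual of candidate features y to a likelihood value."""
    x = sparse_code_ista(D, y, lam)
    residual = np.linalg.norm(y - D @ x)
    return np.exp(-residual ** 2 / (2.0 * sigma ** 2))

# Toy usage: reweight particle weights in the update step of a particle PHD filter.
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 32))
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms, standing in for the offline-trained dictionary
particle_features = rng.standard_normal((100, 64))   # appearance features extracted at particle states
weights = np.ones(100) / 100
likelihoods = np.array([sparse_likelihood(D, f) for f in particle_features])
weights = weights * likelihoods
weights /= weights.sum()                      # normalized particle weights after the update
```

In the paper the features are colour histograms and HOG descriptors and the dictionary is learned offline with dictionary learning and PCA; the sketch only illustrates how an ℓ1 sparse-coding residual can be turned into the likelihood values used in the particle weight update.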