Title: Multi-task Deep Learning for Fast Online Multiple Object Tracking
Authors: Yuqi Zhang, Yongzhen Huang, Liang Wang
Published in: 2017 4th IAPR Asian Conference on Pattern Recognition (ACPR), 2017-11-01
DOI: https://doi.org/10.1109/ACPR.2017.58
Citations: 8
Abstract
We present a multi-task deep learning framework to improve the performance of Multiple Object Tracking (MOT). Motion and appearance cues are combined to build an online multiple object tracker that is both accurate and fast. This paper makes two major contributions: (1) appearance features are learned offline with a triplet loss; (2) a quality-aware deep network is trained by sharing convolutional features. The proposed online tracker achieves state-of-the-art performance on the UA-DETRAC dataset [17] while remaining efficient in running speed.
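The abstract's first contribution relies on a triplet loss for learning appearance embeddings. As a rough illustration (not the paper's implementation — the network architecture, margin value, and distance metric here are assumptions), a hinge-style triplet loss over embedding vectors can be sketched as:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Hinge-style triplet loss on embedding vectors.

    Encourages the anchor to be closer (in squared Euclidean distance)
    to the positive sample than to the negative sample, by at least
    `margin`. The margin value 0.2 is an illustrative assumption.
    """
    d_pos = np.sum((anchor - positive) ** 2)  # anchor-positive distance
    d_neg = np.sum((anchor - negative) ** 2)  # anchor-negative distance
    return max(0.0, d_pos - d_neg + margin)

# Toy 2-D embeddings: anchor near the positive, far from the negative.
a = np.array([1.0, 0.0])
p = np.array([0.9, 0.1])
n = np.array([0.0, 1.0])
print(triplet_loss(a, p, n))  # loss is 0 when d_pos + margin <= d_neg
```

In appearance-based tracking, the anchor and positive would be detections of the same object and the negative a detection of a different object, so minimizing this loss makes embeddings of the same identity cluster together.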