Multi-label CNN based pedestrian attribute learning for soft biometrics

Jianqing Zhu, Shengcai Liao, Dong Yi, Zhen Lei, S. Li
{"title":"Multi-label CNN based pedestrian attribute learning for soft biometrics","authors":"Jianqing Zhu, Shengcai Liao, Dong Yi, Zhen Lei, S. Li","doi":"10.1109/ICB.2015.7139070","DOIUrl":null,"url":null,"abstract":"Recently, pedestrian attributes like gender, age and clothing etc., have been used as soft biometric traits for recognizing people. Unlike existing methods that assume the independence of attributes during their prediction, we propose a multi-label convolutional neural network (MLCNN) to predict multiple attributes together in a unified framework. Firstly, a pedestrian image is roughly divided into multiple overlapping body parts, which are simultaneously integrated in the multi-label convolutional neural network. Secondly, these parts are filtered independently and aggregated in the cost layer. The cost function is a combination of multiple binary attribute classification cost functions. Moreover, we propose an attribute assisted person re-identification method, which fuses attribute distances and low-level feature distances between pairs of person images to improve person re-identification performance. Extensive experiments show: 1) the average attribute classification accuracy of the proposed method is 5.2% and 9.3% higher than the SVM-based method on three public databases, VIPeR and GRID, respectively; 2) the proposed attribute assisted person re-identification method is superior to existing approaches.","PeriodicalId":237372,"journal":{"name":"2015 International Conference on Biometrics (ICB)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"121","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 International Conference on Biometrics (ICB)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICB.2015.7139070","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 121

Abstract

Recently, pedestrian attributes such as gender, age, and clothing have been used as soft biometric traits for recognizing people. Unlike existing methods that assume attributes are independent during prediction, we propose a multi-label convolutional neural network (MLCNN) to predict multiple attributes together in a unified framework. Firstly, a pedestrian image is roughly divided into multiple overlapping body parts, which are simultaneously integrated in the multi-label convolutional neural network. Secondly, these parts are filtered independently and aggregated in the cost layer. The cost function is a combination of multiple binary attribute classification cost functions. Moreover, we propose an attribute-assisted person re-identification method, which fuses attribute distances and low-level feature distances between pairs of person images to improve person re-identification performance. Extensive experiments show that: 1) the average attribute classification accuracy of the proposed method is 5.2% and 9.3% higher than that of the SVM-based method on the public VIPeR and GRID databases, respectively; 2) the proposed attribute-assisted person re-identification method is superior to existing approaches.
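To make the cost formulation and the distance fusion concrete, below is a minimal sketch in Python/NumPy. It assumes a sigmoid output with a binary cross-entropy cost per attribute and a simple weighted sum for fusing distances; the function names, the mixing weight `alpha`, and these specific choices are illustrative assumptions, since the abstract only states that the overall cost combines multiple binary attribute classification costs and that attribute distances are fused with low-level feature distances.

```python
import numpy as np

def multilabel_bce_cost(scores, labels):
    """Combined cost: sum of per-attribute binary cross-entropy terms.

    scores: (N, K) raw network outputs for K binary attributes.
    labels: (N, K) ground-truth attribute labels in {0, 1}.
    """
    probs = 1.0 / (1.0 + np.exp(-scores))        # sigmoid per attribute
    eps = 1e-12                                   # numerical stability
    per_attr = -(labels * np.log(probs + eps)
                 + (1 - labels) * np.log(1 - probs + eps))
    # Sum the K binary costs per image, then average over the batch.
    return per_attr.sum(axis=1).mean()

def fused_distance(feat_dist, attr_dist, alpha=0.5):
    """Fuse a low-level feature distance with an attribute distance.

    alpha is a hypothetical mixing weight; the exact fusion rule and
    weighting are not specified in the abstract.
    """
    return (1 - alpha) * feat_dist + alpha * attr_dist

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scores = rng.normal(size=(4, 10))             # 4 images, 10 attributes
    labels = rng.integers(0, 2, size=(4, 10))
    print("multi-label cost:", multilabel_bce_cost(scores, labels))
    print("fused distance:", fused_distance(1.2, 0.8, alpha=0.3))
```

In a training setup the combined cost would be minimized jointly over all attributes, which is what lets the network share evidence across labels rather than training an independent classifier per attribute.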