Face detection by generating and selecting features based on Kullback-Leibler divergence

K. Morooka, Junya Arakawa, H. Nagahashi
{"title":"Face detection by generating and selecting features based on Kullback-Leibler divergence","authors":"K. Morooka, Junya Arakawa, H. Nagahashi","doi":"10.1002/ECJC.20347","DOIUrl":null,"url":null,"abstract":"Face detection from images is a complex and nonlinear problem due to the various kinds of face images. This problem is solved by conversion of the original feature vectors extracted from images into high-dimension feature vectors using nonlinear mapping, and then finding face/nonface discriminant functions in the mapping space. If such discriminant functions are based on the inner products of high-dimension vectors, such inner products can be easily obtained by substitute calculations of kernel functions in the original feature space. However, in conventional recognition algorithms using kernel functions, numerous features are required to improve recognition accuracy. This paper proposes a new face detection method that uses generation and selection of features on the basis of Kullback-Leibler divergence (KLD). KLD refers to a distance between the distributions of face and nonface data for certain features. Features with large KLD are used for face detection. Moreover, by evaluating the features based on their KLDs, we can generate new features, and deal with different kinds of features concurrently. In experiments, a classifier designed by the proposed method achieved high recognition performance, while using few features. © 2007 Wiley Periodicals, Inc. Electron Comm Jpn Pt 3, 90(10): 29– 39, 2007; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ecjc.20347","PeriodicalId":100407,"journal":{"name":"Electronics and Communications in Japan (Part III: Fundamental Electronic Science)","volume":"62 1","pages":"29-39"},"PeriodicalIF":0.0000,"publicationDate":"2007-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Electronics and Communications in Japan (Part III: Fundamental Electronic Science)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1002/ECJC.20347","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Face detection from images is a complex, nonlinear problem because face images vary widely. A common approach is to map the original feature vectors extracted from images into a high-dimensional feature space through a nonlinear mapping and then to find face/nonface discriminant functions in that space. When such discriminant functions depend only on inner products of the high-dimensional vectors, those inner products can be computed efficiently by evaluating kernel functions in the original feature space instead of carrying out the mapping explicitly. However, conventional recognition algorithms based on kernel functions require a large number of features to achieve high recognition accuracy. This paper proposes a new face detection method that generates and selects features on the basis of the Kullback-Leibler divergence (KLD), a measure of the distance between the distributions of face and nonface data for a given feature. Features with large KLD are used for face detection. Moreover, by evaluating features according to their KLDs, new features can be generated and different kinds of features can be handled concurrently. In experiments, a classifier designed by the proposed method achieved high recognition performance while using few features. © 2007 Wiley Periodicals, Inc. Electron Comm Jpn Pt 3, 90(10): 29–39, 2007; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ecjc.20347
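The abstract does not spell out how the face and nonface distributions are estimated, but the selection step it describes can be illustrated with a short sketch: for each candidate feature, estimate its distribution over face samples and over nonface samples, compute the KL divergence between the two, and keep the features with the largest values. The histogram-based density estimates, the one-sided form of the divergence, and the function names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kld_per_feature(face_feats, nonface_feats, n_bins=32, eps=1e-10):
    """For each feature, estimate D_KL(face distribution || nonface distribution)
    from histograms computed over a shared bin range.

    face_feats, nonface_feats: arrays of shape (n_samples, n_features).
    eps smooths empty bins so the logarithm stays finite.
    """
    n_features = face_feats.shape[1]
    klds = np.empty(n_features)
    for j in range(n_features):
        lo = min(face_feats[:, j].min(), nonface_feats[:, j].min())
        hi = max(face_feats[:, j].max(), nonface_feats[:, j].max())
        bins = np.linspace(lo, hi, n_bins + 1)
        p, _ = np.histogram(face_feats[:, j], bins=bins)
        q, _ = np.histogram(nonface_feats[:, j], bins=bins)
        p = (p + eps) / (p + eps).sum()   # smoothed face density
        q = (q + eps) / (q + eps).sum()   # smoothed nonface density
        klds[j] = np.sum(p * np.log(p / q))
    return klds

def select_features(face_feats, nonface_feats, k=50):
    """Return indices of the k features whose face/nonface distributions differ most."""
    klds = kld_per_feature(face_feats, nonface_feats)
    return np.argsort(klds)[::-1][:k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: 200 features; only the first 10 actually separate face from nonface.
    face = rng.normal(0.0, 1.0, size=(500, 200))
    face[:, :10] += 1.5
    nonface = rng.normal(0.0, 1.0, size=(500, 200))
    print(sorted(select_features(face, nonface, k=10)))  # mostly indices 0-9
```

Only the selection step is sketched here; the paper's feature generation and the kernel-based face/nonface classifier built on the selected features are not reproduced.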