3D Face Reconstruction via Feature Point Depth Estimation and Shape Deformation

Quan Xiao, Lihua Han, Peizhong Liu
{"title":"3D Face Reconstruction via Feature Point Depth Estimation and Shape Deformation","authors":"Quan Xiao, Lihua Han, Peizhong Liu","doi":"10.1109/ICPR.2014.392","DOIUrl":null,"url":null,"abstract":"Since a human face can be represented by a few feature points (FPs) with less redundant information, and calculated by a linear combination of a small number of prototypical faces, we propose a two-step 3D face reconstruction approach including FP depth estimation and shape deformation. The proposed approach can reconstruct a realistic 3D face from a 2D frontal face image. In the first step, a coupled dictionary learning method based on sparse representation is employed to explore the underlying mappings between 2D and 3D training FPs, and then the depth of the FPs is estimated. In the second step, a novel shape deformation method is proposed to reconstruct the 3D face by combining a small number of most relevant deformed faces by the estimated FPs. The proposed approach can explore the distributions of 2D and 3D faces and the underlying mappings between them well, because human faces are represented by low-dimensional FPs, and their distributions are described by sparse representations. Moreover, it is much more flexible since we can make any change in any step. Extensive experiments are conducted on BJUT_3D database, and the results validate the effectiveness of the proposed approach.","PeriodicalId":142159,"journal":{"name":"2014 22nd International Conference on Pattern Recognition","volume":"152 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 22nd International Conference on Pattern Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICPR.2014.392","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 9

Abstract

Since a human face can be represented by a small number of feature points (FPs) carrying little redundant information, and can be approximated by a linear combination of a few prototypical faces, we propose a two-step 3D face reconstruction approach consisting of FP depth estimation and shape deformation. The proposed approach reconstructs a realistic 3D face from a single 2D frontal face image. In the first step, a coupled dictionary learning method based on sparse representation is employed to capture the underlying mappings between 2D and 3D training FPs, and the depths of the FPs are then estimated. In the second step, a novel shape deformation method reconstructs the 3D face by combining a small number of the most relevant faces, deformed according to the estimated FPs. Because human faces are represented by low-dimensional FPs whose distributions are described by sparse representations, the approach models the distributions of 2D and 3D faces, and the underlying mappings between them, well. Moreover, the approach is flexible, since each step can be modified independently. Extensive experiments conducted on the BJUT_3D database validate the effectiveness of the proposed approach.
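The abstract leaves the coupled dictionary learning unspecified; the sketch below shows one common way to realize the step-1 idea, where paired 2D FP coordinates and 3D FP depths are stacked so that both modalities share a single sparse code. The matrix names (`X2d_train`, `Z_train`) and the joint-stacking construction are illustrative assumptions, not the authors' exact formulation.

```python
# A minimal sketch of coupled sparse coding for FP depth estimation.
# Assumptions: X2d_train (n_samples, d2) holds 2D FP coordinates and
# Z_train (n_samples, d3) holds the corresponding FP depths.
import numpy as np
from sklearn.decomposition import DictionaryLearning, sparse_encode

def estimate_fp_depth(X2d_train, Z_train, x2d_test, n_atoms=64, alpha=1.0):
    # Learn one dictionary over the stacked modalities, forcing the 2D FPs
    # and their depths to share the same sparse code per training face.
    joint = np.hstack([X2d_train, Z_train])
    dl = DictionaryLearning(n_components=n_atoms, alpha=alpha, max_iter=500)
    dl.fit(joint)
    d2 = X2d_train.shape[1]
    D2d, D3d = dl.components_[:, :d2], dl.components_[:, d2:]

    # Sparse-code the test face's 2D FPs over the 2D sub-dictionary,
    # then transfer the code to the depth sub-dictionary.
    code = sparse_encode(x2d_test[None, :], D2d, alpha=alpha)
    return (code @ D3d).ravel()  # estimated FP depths
```

Step 2 is likewise only outlined; one plausible reading is to select the k prototype faces whose FPs best match the estimate and blend them with least-squares weights. The nearest-neighbor selection and the weight normalization below are assumptions for illustration, not the paper's stated deformation method.

```python
# A hypothetical sketch of step 2: combine the most relevant deformed faces.
import numpy as np

def combine_relevant_faces(proto_fps, proto_shapes, est_fps, k=5):
    # proto_fps: (m, d) FP vectors of deformed prototype faces;
    # proto_shapes: (m, D) dense 3D shapes; est_fps: (d,) from step 1.
    idx = np.argsort(np.linalg.norm(proto_fps - est_fps, axis=1))[:k]

    # Least-squares weights so the blended FPs approximate the estimate,
    # then apply the same weights to the dense shapes.
    w, *_ = np.linalg.lstsq(proto_fps[idx].T, est_fps, rcond=None)
    w = w / w.sum()  # normalize; assumes the weights do not sum to ~0
    return proto_shapes[idx].T @ w  # reconstructed dense 3D face
```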