Synthesizing Performance-driven Facial Animation

JCR quartile: Q2 (Computer Science)
Chang-Wei LUO, Jun YU, Zeng-Fu WANG
{"title":"合成性能驱动的面部动画","authors":"Chang-Wei LUO ,&nbsp;Jun YU ,&nbsp;Zeng-Fu WANG","doi":"10.1016/S1874-1029(14)60361-X","DOIUrl":null,"url":null,"abstract":"<div><p>In this paper, we present a system for real-time performance-driven facial animation. With the system, the user can control the facial expression of a digital character by acting out the desired facial action in front of an ordinary camera. First, we create a muscle-based 3D face model. The muscle actuation parameters are used to animate the face model. To increase the reality of facial animation, the orbicularis oris in our face model is divided into the inner part and outer part. We also establish the relationship between jaw rotation and facial surface deformation. Second, a real-time facial tracking method is employed to track the facial features of a performer in the video. Finally, the tracked facial feature points are used to estimate muscle actuation parameters to drive the face model. Experimental results show that our system runs in real time and outputs realistic facial animations. Compared with most existing performance-based facial animation systems, ours does not require facial markers, intrusive lighting, or special scanning equipment, thus it is inexpensive and easy to use.</p></div>","PeriodicalId":35798,"journal":{"name":"自动化学报","volume":"40 10","pages":"Pages 2245-2252"},"PeriodicalIF":0.0000,"publicationDate":"2014-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1016/S1874-1029(14)60361-X","citationCount":"3","resultStr":"{\"title\":\"Synthesizing Performance-driven Facial Animation\",\"authors\":\"Chang-Wei LUO ,&nbsp;Jun YU ,&nbsp;Zeng-Fu WANG\",\"doi\":\"10.1016/S1874-1029(14)60361-X\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>In this paper, we present a system for real-time performance-driven facial animation. With the system, the user can control the facial expression of a digital character by acting out the desired facial action in front of an ordinary camera. First, we create a muscle-based 3D face model. The muscle actuation parameters are used to animate the face model. To increase the reality of facial animation, the orbicularis oris in our face model is divided into the inner part and outer part. We also establish the relationship between jaw rotation and facial surface deformation. Second, a real-time facial tracking method is employed to track the facial features of a performer in the video. Finally, the tracked facial feature points are used to estimate muscle actuation parameters to drive the face model. Experimental results show that our system runs in real time and outputs realistic facial animations. 
Compared with most existing performance-based facial animation systems, ours does not require facial markers, intrusive lighting, or special scanning equipment, thus it is inexpensive and easy to use.</p></div>\",\"PeriodicalId\":35798,\"journal\":{\"name\":\"自动化学报\",\"volume\":\"40 10\",\"pages\":\"Pages 2245-2252\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2014-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.1016/S1874-1029(14)60361-X\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"自动化学报\",\"FirstCategoryId\":\"1093\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S187410291460361X\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"Computer Science\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"自动化学报","FirstCategoryId":"1093","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S187410291460361X","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Computer Science","Score":null,"Total":0}
Citations: 3

Abstract

In this paper, we present a system for real-time performance-driven facial animation. With the system, the user can control the facial expression of a digital character by acting out the desired facial action in front of an ordinary camera. First, we create a muscle-based 3D face model; muscle actuation parameters are used to animate it. To increase the realism of the facial animation, the orbicularis oris in our face model is divided into an inner part and an outer part, and we establish the relationship between jaw rotation and facial surface deformation. Second, a real-time facial tracking method is employed to track the facial features of a performer in the video. Finally, the tracked facial feature points are used to estimate the muscle actuation parameters that drive the face model. Experimental results show that our system runs in real time and outputs realistic facial animations. Compared with most existing performance-based facial animation systems, ours does not require facial markers, intrusive lighting, or special scanning equipment, so it is inexpensive and easy to use.
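The abstract gives only a high-level view of the core mapping: tracked 2D feature points are converted into muscle actuation parameters, which in turn deform the 3D face mesh. The paper does not spell out the estimation step here, so the sketch below is only an illustration of one common way to realize such a mapping, assuming each muscle contributes a fixed linear displacement to the feature points and recovering the actuations by clipped least squares. All function names, array shapes, and the linearity assumption are hypothetical, not taken from the paper.

```python
import numpy as np

def estimate_muscle_actuations(tracked_pts, neutral_pts, basis):
    """Estimate muscle actuation parameters from tracked 2D feature points.

    tracked_pts : (K, 2) feature points tracked in the current video frame
    neutral_pts : (K, 2) the same feature points on the neutral face model
    basis       : (2K, M) displacement of the feature points per muscle at full actuation
    returns     : (M,) actuation parameters clipped to [0, 1]
    """
    delta = (tracked_pts - neutral_pts).reshape(-1)        # observed deformation
    a, *_ = np.linalg.lstsq(basis, delta, rcond=None)      # solve basis @ a ~= delta
    return np.clip(a, 0.0, 1.0)                            # keep actuations physically plausible

def animate_face(neutral_vertices, vertex_basis, actuations):
    """Apply the estimated actuations to deform the full 3D face mesh."""
    # neutral_vertices : (V, 3), vertex_basis : (3V, M), actuations : (M,)
    offset = (vertex_basis @ actuations).reshape(-1, 3)
    return neutral_vertices + offset

# Illustrative usage with random placeholder data (K = 68 feature points, M = 18 muscles).
rng = np.random.default_rng(0)
neutral = rng.standard_normal((68, 2))
basis = rng.standard_normal((68 * 2, 18))
tracked = neutral + (basis @ rng.uniform(0.0, 1.0, 18)).reshape(68, 2)
print(estimate_muscle_actuations(tracked, neutral, basis))
```

In practice a muscle-based model is nonlinear (muscle contractions interact and jaw rotation is rigid), so a per-frame nonlinear or constrained solver would replace the plain least-squares step; the sketch only conveys the direction of the data flow described in the abstract.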

Source journal: 自动化学报 (Acta Automatica Sinica)
Subject area: Computer Science - Computer Graphics and Computer-Aided Design
CiteScore: 4.80
Self-citation rate: 0.00%
Articles published: 6655
Journal description: ACTA AUTOMATICA SINICA is a joint publication of the Chinese Association of Automation and the Institute of Automation, Chinese Academy of Sciences. Its objective is the high-quality and rapid publication of articles, with a strong focus on new trends, original theoretical and experimental research and developments, emerging technologies, and industrial standards in automation.