{"title":"面部动画框架的网络和移动平台","authors":"E. Mendi, Coskun Bayrak","doi":"10.1109/HEALTH.2011.6026785","DOIUrl":null,"url":null,"abstract":"In this paper, we present a realistic facial animation framework for web and mobile platforms. The proposed system converts the text into 3D face animation with synthetic voice, ensuring synchronization of the head and eye movements with emotions and word flow of a sentence. The expression tags embedded in the input sentences turn into given emotion on the face while the virtual face is speaking. The final face motion is obtained by interpolating the keyframes over time to generate transitions between facial expressions. Visual results of the animation are sufficient for web and mobile environments. The proposed system may contribute to the development of various new generation e-Health applications such as intelligent communication systems, human-machine interfaces and interfaces for handicapped people.","PeriodicalId":187103,"journal":{"name":"2011 IEEE 13th International Conference on e-Health Networking, Applications and Services","volume":"19 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"Facial animation framework for web and mobile platforms\",\"authors\":\"E. Mendi, Coskun Bayrak\",\"doi\":\"10.1109/HEALTH.2011.6026785\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we present a realistic facial animation framework for web and mobile platforms. The proposed system converts the text into 3D face animation with synthetic voice, ensuring synchronization of the head and eye movements with emotions and word flow of a sentence. The expression tags embedded in the input sentences turn into given emotion on the face while the virtual face is speaking. The final face motion is obtained by interpolating the keyframes over time to generate transitions between facial expressions. Visual results of the animation are sufficient for web and mobile environments. 
The proposed system may contribute to the development of various new generation e-Health applications such as intelligent communication systems, human-machine interfaces and interfaces for handicapped people.\",\"PeriodicalId\":187103,\"journal\":{\"name\":\"2011 IEEE 13th International Conference on e-Health Networking, Applications and Services\",\"volume\":\"19 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-06-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2011 IEEE 13th International Conference on e-Health Networking, Applications and Services\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/HEALTH.2011.6026785\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2011 IEEE 13th International Conference on e-Health Networking, Applications and Services","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HEALTH.2011.6026785","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Facial animation framework for web and mobile platforms
In this paper, we present a realistic facial animation framework for web and mobile platforms. The proposed system converts text into a 3D facial animation with a synthetic voice, synchronizing head and eye movements with the emotions and word flow of a sentence. Expression tags embedded in the input sentences are rendered as the corresponding emotions on the face while the virtual face speaks. The final facial motion is obtained by interpolating keyframes over time to generate transitions between facial expressions. The visual quality of the animation is sufficient for web and mobile environments. The proposed system may contribute to the development of various new-generation e-Health applications such as intelligent communication systems, human-machine interfaces, and interfaces for people with disabilities.
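To make the tag-driven, keyframe-interpolation idea concrete, the following is a minimal sketch, not the authors' implementation: it assumes hypothetical names (EXPRESSIONS, parse_tagged_text, interpolate_keyframes) and a simple blendshape-weight representation, and shows how expression tags in a sentence could select keyframes that are linearly interpolated over time to produce transitions between facial expressions.

```python
# Minimal illustrative sketch (not the paper's code). All names and the
# blendshape-weight representation are assumptions for demonstration only.

import re

# Hypothetical blendshape weights (0.0 .. 1.0) for a few expressions.
EXPRESSIONS = {
    "neutral": {"mouth_smile": 0.0, "brow_raise": 0.0, "eye_open": 1.0},
    "happy":   {"mouth_smile": 0.9, "brow_raise": 0.3, "eye_open": 0.8},
    "sad":     {"mouth_smile": 0.0, "brow_raise": 0.6, "eye_open": 0.6},
}

def parse_tagged_text(text):
    """Split a tagged sentence such as '[happy] Hello there.' into
    (expression, spoken text) pairs; unknown tags fall back to 'neutral'."""
    pairs = []
    for tag, chunk in re.findall(r"\[(\w+)\]([^\[]*)", text):
        pairs.append((tag if tag in EXPRESSIONS else "neutral", chunk.strip()))
    return pairs

def interpolate_keyframes(start, end, steps):
    """Linearly blend two expression keyframes into in-between frames."""
    frames = []
    for i in range(steps + 1):
        t = i / steps
        frames.append({k: (1 - t) * start[k] + t * end[k] for k in start})
    return frames

if __name__ == "__main__":
    sentence = "[neutral] The system converts text [happy] into animation."
    segments = parse_tagged_text(sentence)
    # Generate a short transition between each consecutive pair of expressions.
    for (expr_a, _), (expr_b, words) in zip(segments, segments[1:]):
        transition = interpolate_keyframes(EXPRESSIONS[expr_a],
                                           EXPRESSIONS[expr_b], steps=4)
        print(f"-> '{words}' uses {len(transition)} transition frames "
              f"from {expr_a} to {expr_b}")
```

In a real system the interpolated weights would drive a 3D face rig and be timed against the synthesized speech; the linear blend above stands in for whatever easing or timing model the framework actually uses.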