Game Character Facial Animation Using Actor Video Corpus and Recurrent Neural Networks
Sheldon Schiffer
2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA), pp. 674-681, December 2021
DOI: 10.1109/ICMLA52953.2021.00113
Citations: 1
Abstract
Creating photorealistic facial animation for game characters is a labor-intensive process that gives authorial primacy to animators. This research presents an experimental autonomous animation controller based on an emotion model that uses a team of embedded recurrent neural networks (RNNs). The design is a novel alternative method that can elevate an actor’s contribution to game character design. This research presents the first results of combining a facial emotion neural network model with a workflow that incorporates actor preparation methods and the training of auto-regressive bidirectional RNNs with long short-term memory (LSTM) cells. The emotion vectors predicted in response to player facial stimuli closely resemble those of the actor performing the game character, with accuracies over 80% for targeted emotion labels, near or above a strong baseline standard.
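The abstract describes bidirectional LSTM-based RNNs that map a sequence of facial-stimulus features to per-frame emotion vectors. The paper does not specify its architecture details, feature dimensions, or emotion label set, so the following is only a minimal illustrative sketch of the general technique it names: a bidirectional LSTM over a frame sequence, whose forward and backward hidden states are concatenated and projected to a softmax distribution over emotion labels. All names and dimensions here (`D`, `H`, `E`, the 6-label emotion space) are hypothetical, not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One standard LSTM cell step: gates stacked as [input, forget, output, candidate]."""
    z = W @ x + U @ h + b
    H = h.size
    i = sigmoid(z[:H])
    f = sigmoid(z[H:2 * H])
    o = sigmoid(z[2 * H:3 * H])
    g = np.tanh(z[3 * H:])
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def bidirectional_emotion_predictor(frames, params_fw, params_bw, W_out):
    """Run an LSTM forward and backward over the frame features, then map the
    concatenated hidden states to a softmax emotion distribution per frame."""
    H = params_fw[2].size // 4

    def run(seq, params):
        h, c = np.zeros(H), np.zeros(H)
        outs = []
        for x in seq:
            h, c = lstm_step(x, h, c, *params)
            outs.append(h)
        return outs

    fw = run(frames, params_fw)
    bw = run(frames[::-1], params_bw)[::-1]  # backward pass, realigned to frame order
    probs = []
    for f_h, b_h in zip(fw, bw):
        logits = W_out @ np.concatenate([f_h, b_h])
        e = np.exp(logits - logits.max())     # numerically stable softmax
        probs.append(e / e.sum())
    return probs

# Illustrative dimensions only: D facial features per frame, H hidden units,
# E emotion labels, T frames. None of these values come from the paper.
rng = np.random.default_rng(0)
D, H, E, T = 8, 16, 6, 10

def init_params():
    return (0.1 * rng.standard_normal((4 * H, D)),   # input weights W
            0.1 * rng.standard_normal((4 * H, H)),   # recurrent weights U
            np.zeros(4 * H))                         # bias b

frames = [rng.standard_normal(D) for _ in range(T)]
W_out = 0.1 * rng.standard_normal((E, 2 * H))
emotions = bidirectional_emotion_predictor(frames, init_params(), init_params(), W_out)
```

In a trained system these weights would be learned from the actor video corpus; here they are random, so the sketch only demonstrates the data flow, one probability vector over emotion labels per input frame.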