Predicting Active Facial Expressivity in People with Parkinson's Disease
Ajjen Joshi, L. Tickle-Degnen, S. Gunnery, T. Ellis, Margrit Betke
Proceedings of the 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments, published 2016-06-29. DOI: https://doi.org/10.1145/2910674.2910686
Citations: 9
Abstract
Our capacity to engage in meaningful conversations depends on a multitude of communication signals, including verbal delivery of speech, tone and modulation of voice, execution of body gestures, and exhibition of a range of facial expressions. Among these cues, the expressivity of the face strongly indicates the level of one's engagement during a social interaction. It also significantly influences how others perceive one's personality and mood. Individuals with Parkinson's disease whose facial muscles have become rigid have difficulty exhibiting facial expressions. In this work, we investigate how to computationally predict an accurate and objective score for the facial expressivity of a person. We present a method that computes geometric shape features of the face and predicts a score for facial expressivity. Our method trains a random forest regressor on features extracted from a set of training videos of interviews of people with Parkinson's disease. We evaluated our formulation on a dataset of 727 20-second video clips using 9-fold cross validation. We also provide insight into the geometric features that are important in this prediction task by computing variable importance scores for our features. Our results suggest that the dynamics of the eyes and eyebrows are better predictors of facial expressivity than the dynamics of the mouth.
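The pipeline the abstract outlines (a random forest regressor trained on geometric facial features, evaluated with 9-fold cross validation, and interpreted through variable importance scores) can be sketched with standard tooling. The snippet below is a minimal illustration of that kind of setup, not the authors' implementation: it assumes scikit-learn as the library, uses placeholder random data in place of the real facial-shape features and rater-assigned expressivity scores, and the feature count and score range are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): random forest regression on
# geometric facial-shape features with 9-fold CV and variable importances.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_score

# X: one row per 20-second clip, columns = geometric shape features
# (e.g., eyebrow displacement, eye-opening variance, mouth-corner motion).
# y: expressivity score for each clip. Both are placeholders here.
rng = np.random.default_rng(0)
n_clips, n_features = 727, 12                 # 727 clips per the abstract; feature count assumed
X = rng.normal(size=(n_clips, n_features))    # stand-in for real geometric features
y = rng.uniform(1, 7, size=n_clips)           # stand-in for expressivity ratings

model = RandomForestRegressor(n_estimators=200, random_state=0)

# 9-fold cross validation, mirroring the evaluation protocol in the abstract.
cv = KFold(n_splits=9, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=cv, scoring="neg_mean_squared_error")
print("mean CV MSE:", -scores.mean())

# Variable importances: which geometric features drive the prediction.
model.fit(X, y)
for idx in np.argsort(model.feature_importances_)[::-1][:5]:
    print(f"feature {idx}: importance {model.feature_importances_[idx]:.3f}")
```

With real features, the importance ranking is what would surface the paper's finding that eye and eyebrow dynamics outweigh mouth dynamics in predicting expressivity.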