{"title":"Emotion Recognition through Gait on Mobile Devices","authors":"Mangtik Chiu, Jiayu Shu, P. Hui","doi":"10.1109/PERCOMW.2018.8480374","DOIUrl":null,"url":null,"abstract":"Building systems that have the ability to recognize human emotions has attracted much interest in recent years. Common approaches toward machine emotion recognition focus on detection of facial expressions and analysis of physiological signals. However, in situations where these features cannot be easily obtained, emotion recognition becomes a challenging problem. In this paper, we explore the possibility of emotion recognition through gait, which is one of the most common human behaviors. We first identify various motion features based on pose estimation from captured video frames. We then train several supervised learning models, including SVM, Multilayer Perceptron, Naive Bayes, Decision Tree, Random Forest and Logistic Regression, using selected features and compare their performances. The best model trained to classify five emotion labels has an accuracy of 64%. Finally, we implement a proof-of-concept mobile-server system for emotion recognition in real-life scenarios using smartphone cameras.","PeriodicalId":190096,"journal":{"name":"2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"16","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PERCOMW.2018.8480374","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 16
Abstract
Building systems that can recognize human emotions has attracted much interest in recent years. Common approaches to machine emotion recognition focus on detecting facial expressions and analyzing physiological signals. However, in situations where these features cannot be easily obtained, emotion recognition becomes a challenging problem. In this paper, we explore the possibility of emotion recognition through gait, one of the most common human behaviors. We first identify various motion features based on pose estimation from captured video frames. We then train several supervised learning models, including SVM, Multilayer Perceptron, Naive Bayes, Decision Tree, Random Forest, and Logistic Regression, using selected features and compare their performance. The best model trained to classify five emotion labels achieves an accuracy of 64%. Finally, we implement a proof-of-concept mobile-server system for emotion recognition in real-life scenarios using smartphone cameras.
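To make the pipeline described in the abstract concrete, below is a minimal sketch (not the authors' implementation) of the two steps it outlines: reducing a pose-keypoint sequence to a fixed-length gait feature vector, and cross-validating the six classifier families named in the abstract. The feature set (per-joint statistics of position and frame-to-frame velocity), the synthetic data shapes, and all hyperparameters are illustrative assumptions, not the features or settings selected in the paper.

```python
# Illustrative sketch of gait-feature extraction and classifier comparison.
# Assumes pose estimation has already produced 2D keypoints per frame;
# the feature definitions and model settings below are assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def extract_gait_features(keypoints):
    """Turn a (frames, joints, 2) array of 2D pose keypoints into a
    fixed-length vector: per-joint mean/std of position and of
    frame-to-frame velocity. An illustrative feature set only."""
    velocity = np.diff(keypoints, axis=0)
    feats = [keypoints.mean(axis=0), keypoints.std(axis=0),
             velocity.mean(axis=0), velocity.std(axis=0)]
    return np.concatenate([f.ravel() for f in feats])


def compare_classifiers(X, y):
    """Cross-validate the six model families mentioned in the abstract
    and print mean accuracy for each."""
    models = {
        "SVM": SVC(kernel="rbf"),
        "Multilayer Perceptron": MLPClassifier(max_iter=1000),
        "Naive Bayes": GaussianNB(),
        "Decision Tree": DecisionTreeClassifier(),
        "Random Forest": RandomForestClassifier(n_estimators=100),
        "Logistic Regression": LogisticRegression(max_iter=1000),
    }
    for name, model in models.items():
        pipe = make_pipeline(StandardScaler(), model)
        scores = cross_val_score(pipe, X, y, cv=5, scoring="accuracy")
        print(f"{name:22s} accuracy = {scores.mean():.2f}")


if __name__ == "__main__":
    # Synthetic stand-in data: 200 gait samples, 30 frames of 18 joints,
    # each labelled with one of five emotions (0..4).
    rng = np.random.default_rng(0)
    sequences = rng.normal(size=(200, 30, 18, 2))
    labels = rng.integers(0, 5, size=200)
    X = np.stack([extract_gait_features(seq) for seq in sequences])
    compare_classifiers(X, labels)
```

With real keypoint sequences from a pose estimator and labelled gait recordings in place of the synthetic arrays, the same comparison loop would reproduce the kind of per-model accuracy table the abstract summarizes with its best result of 64% over five emotion labels.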