Title: Automated facial expression recognition app development on smart phones using cloud computing
Authors: Humaid Alshamsi, Veton Këpuska, H. Meng
Venue: 2017 IEEE 8th Annual Ubiquitous Computing, Electronics and Mobile Communication Conference (UEMCON), October 2017
DOI: 10.1109/UEMCON.2017.8249000
Citations: 10
Abstract
Automated human emotion detection is a topic of significant interest in the field of computer vision. Over the past decade, much emphasis has been placed on using facial expression recognition (FER) to extract emotion from facial expressions. In this paper, the proposed system presents a novel method of facial expression recognition based on the cloud model, combined with a traditional facial expression system. The process of predicting emotions from facial expression images comprises several stages. The first stage is pre-processing, in which the face is detected in each image and the images are resized. The second stage extracts features from the facial expression images using Facial Landmarks and Center of Gravity (COG) feature extraction algorithms, which generate training and testing datasets covering the expressions of Anger, Disgust, Fear, Happiness, Neutrality, Sadness, and Surprise. Support Vector Machine (SVM) classifiers are then used in the classification stage to predict the emotion, and a Confusion Matrix (CM) is used to evaluate classifier performance. The proposed system is tested on the CK+, JAFFE, and KDEF databases, achieving a prediction rate of 96.3% when the Facial Landmarks and Center of Gravity (COG) features are used with the SVM classifier.
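The pipeline described in the abstract (landmark-based features, COG normalization, SVM classification, confusion-matrix evaluation) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it substitutes synthetic landmark coordinates for real detected faces, and the `cog_features` helper and class layout are assumptions made for the example.

```python
# Illustrative sketch of the abstract's pipeline, using synthetic landmark
# data in place of real face images. Function names are hypothetical.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix

EMOTIONS = ["Anger", "Disgust", "Fear", "Happiness",
            "Neutrality", "Sadness", "Surprise"]

def cog_features(landmarks):
    """COG-style features: offset of each landmark from the face's
    center of gravity (mean point), flattened into one vector."""
    cog = landmarks.mean(axis=0)          # (x, y) center of gravity
    return (landmarks - cog).ravel()      # translation-invariant offsets

rng = np.random.default_rng(0)

# Synthetic stand-in dataset: 10 "faces" per emotion, each face a noisy
# copy of a class-specific 68-point landmark layout.
X, y = [], []
for label in range(len(EMOTIONS)):
    layout = rng.normal(label, 0.1, size=(68, 2))   # class-specific layout
    for _ in range(10):
        noisy = layout + rng.normal(0, 0.02, size=(68, 2))
        X.append(cog_features(noisy))
        y.append(label)
X, y = np.array(X), np.array(y)

# Classification stage: train an SVM on half the samples, test on the rest.
clf = SVC(kernel="rbf").fit(X[::2], y[::2])
pred = clf.predict(X[1::2])

# Evaluation stage: 7x7 confusion matrix, one row/column per emotion.
cm = confusion_matrix(y[1::2], pred)
```

A real system would replace the synthetic layouts with landmarks from a face detector applied to the pre-processed (detected and resized) images; the COG subtraction step is what makes the features robust to where the face sits in the frame.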