{"title":"Automated Evaluation of Child Emotion Expression and Recognition Abilities","authors":"Sahdiya Suhan, Kovishwakarunya Kalaichelvan, Lahiru Samarage, D. Alahakoon, Pradeepa Samarasinghe, Madhuka Nadeeshani","doi":"10.1109/ICITSI56531.2022.9970990","DOIUrl":null,"url":null,"abstract":"Advancement of human computer interactions have led to the development of automated systems for the study of facial emotions. Currently, human intervention is required to identify the emotional state of an individual, thus our study proposes an automated process using a novel mobile application named Chezer which aims to identify the emotion expression and recognition abilities of children. The main objective of this study is to provide an affordable application to low income families by eliminating the need to visit clinics for child emotion related concerns. Chezer uses a multi-sensory gaming approach in developing games to identify the emotion expression and recognition ability of a child, where a level and a scenario based gaming methods are implemented respectively. The games include various methods such as visual and audio stimulation and quiz based assessments to aid a parent in analyzing the ability of their children. To evaluate the games, an emotion prediction model based on Convolution Neural Network which yields an accuracy of 90% is incorporated along with other models to detect valence, arousal and emotion levels which makes the study unique in the context of children. Valence and arousal detection models yield Root Mean Square Error scores of 0.23 and 0.05 respectively. LIRIS and EmoReact children video datasets were used for the model training and testing purposes. Further, the application was tested among the children at Sri Lanka which yielded promising results. Overall, combination of all these features in a single application makes the study novel.","PeriodicalId":439918,"journal":{"name":"2022 International Conference on Information Technology Systems and Innovation (ICITSI)","volume":"43 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Conference on Information Technology Systems and Innovation (ICITSI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICITSI56531.2022.9970990","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Advancements in human-computer interaction have led to the development of automated systems for the study of facial emotions. Currently, human intervention is required to identify the emotional state of an individual; our study therefore proposes an automated process using a novel mobile application named Chezer, which aims to identify the emotion expression and recognition abilities of children. The main objective of this study is to provide an affordable application to low-income families by eliminating the need to visit clinics for child emotion-related concerns. Chezer uses a multi-sensory gaming approach to identify a child's emotion expression and recognition abilities, implementing level-based and scenario-based games, respectively. The games include methods such as visual and audio stimulation and quiz-based assessments to help a parent analyze their child's abilities. To evaluate the games, an emotion prediction model based on a Convolutional Neural Network, which yields an accuracy of 90%, is incorporated along with models that detect valence, arousal, and emotion levels, making the study unique in the context of children. The valence and arousal detection models yield Root Mean Square Error scores of 0.23 and 0.05, respectively. The LIRIS and EmoReact children's video datasets were used for model training and testing. Further, the application was tested among children in Sri Lanka and yielded promising results. Overall, the combination of all these features in a single application makes the study novel.
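To make the evaluation pipeline concrete, the sketch below shows a minimal CNN emotion classifier and the Root Mean Square Error metric reported for the valence and arousal models. This is not the authors' Chezer architecture: the 48x48 grayscale input, the seven-class emotion set, the layer configuration, and the use of Keras are illustrative assumptions, since the abstract only states that a Convolutional Neural Network classifier and RMSE-scored valence/arousal models are used.

```python
# Illustrative sketch only; the actual Chezer model, input format, and label
# set are not specified in the abstract and are assumed here.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_EMOTIONS = 7  # assumed basic-emotion label set; not stated in the abstract


def build_emotion_cnn(input_shape=(48, 48, 1), num_classes=NUM_EMOTIONS):
    """Plain convolutional classifier mapping a face crop to emotion probabilities."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


def rmse(y_true, y_pred):
    """Root Mean Square Error, the score reported for the valence/arousal models."""
    y_true, y_pred = np.asarray(y_true, dtype=float), np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))


if __name__ == "__main__":
    model = build_emotion_cnn()
    model.summary()
    # Example RMSE on dummy valence predictions in [-1, 1].
    print(rmse([0.2, -0.5, 0.8], [0.1, -0.4, 0.6]))
```

In a setup like this, the classifier would be trained on frames extracted from children's video datasets such as LIRIS and EmoReact, while separate regression heads or models would predict continuous valence and arousal values scored with RMSE, mirroring the 0.23 and 0.05 figures reported in the abstract.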