{"title":"测量观察者对情感视频的EDA反应","authors":"J. Rahman, Md. Zakir Hossain, Tom Gedeon","doi":"10.1145/3369457.3369516","DOIUrl":null,"url":null,"abstract":"Future human computing research could be enriched by enabling the computer to recognize emotional states from observers' physiological activities. In this paper, observers' electrodermal activities (EDA) are analyzed to recognize 7 emotional categories while watching total of 80 emotional videos. Twenty participants participated as observers and 16 features were extracted from each video's respective EDA signal after a few processing steps. Mean analysis shows that a few emotions are significantly different from each other, but not all of them. Our generated arousal model on this dataset with these participants using their EDA responses also differs a little from the abstract models proposed in the literature. Finally, leave-one-observer-out approach and neural network classifier were employed to measure the performance, and the classifier reaches up to 94.8% correctness at the seven-class problem. The high accuracy inspires the potential of this system to use in future for recognizing emotions from observers' physiology in human computer interaction settings. Our generation of an arousal model for a specific setting has potential for investigating potential bias in dataset selection via measuring participant responses to that dataset.","PeriodicalId":258766,"journal":{"name":"Proceedings of the 31st Australian Conference on Human-Computer-Interaction","volume":"14 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Measuring Observers' EDA Responses to Emotional Videos\",\"authors\":\"J. Rahman, Md. Zakir Hossain, Tom Gedeon\",\"doi\":\"10.1145/3369457.3369516\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Future human computing research could be enriched by enabling the computer to recognize emotional states from observers' physiological activities. In this paper, observers' electrodermal activities (EDA) are analyzed to recognize 7 emotional categories while watching total of 80 emotional videos. Twenty participants participated as observers and 16 features were extracted from each video's respective EDA signal after a few processing steps. Mean analysis shows that a few emotions are significantly different from each other, but not all of them. Our generated arousal model on this dataset with these participants using their EDA responses also differs a little from the abstract models proposed in the literature. Finally, leave-one-observer-out approach and neural network classifier were employed to measure the performance, and the classifier reaches up to 94.8% correctness at the seven-class problem. The high accuracy inspires the potential of this system to use in future for recognizing emotions from observers' physiology in human computer interaction settings. 
Our generation of an arousal model for a specific setting has potential for investigating potential bias in dataset selection via measuring participant responses to that dataset.\",\"PeriodicalId\":258766,\"journal\":{\"name\":\"Proceedings of the 31st Australian Conference on Human-Computer-Interaction\",\"volume\":\"14 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-12-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 31st Australian Conference on Human-Computer-Interaction\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3369457.3369516\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 31st Australian Conference on Human-Computer-Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3369457.3369516","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Measuring Observers' EDA Responses to Emotional Videos
Future human computing research could be enriched by enabling computers to recognize emotional states from observers' physiological activity. In this paper, observers' electrodermal activity (EDA) is analyzed to recognize seven emotional categories while they watch a total of 80 emotional videos. Twenty participants took part as observers, and 16 features were extracted from the EDA signal recorded for each video after a few processing steps. Mean analysis shows that some emotions differ significantly from each other, but not all of them. The arousal model we generated from these participants' EDA responses on this dataset also differs slightly from the abstract models proposed in the literature. Finally, a leave-one-observer-out approach with a neural network classifier was used to measure performance, and the classifier reaches up to 94.8% accuracy on the seven-class problem. This high accuracy suggests the system's potential for recognizing emotions from observers' physiology in future human-computer interaction settings. Generating an arousal model for a specific setting also offers a way to investigate bias in dataset selection by measuring participant responses to that dataset.
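For readers unfamiliar with the evaluation scheme named in the abstract, the sketch below illustrates leave-one-observer-out cross-validation with a small neural network over per-video EDA feature vectors. It is a minimal illustration, not the authors' pipeline: the placeholder features, array shapes, and the MLP architecture are assumptions; only the counts (20 observers, 80 videos, 16 features, 7 classes) come from the abstract.

```python
# Minimal sketch of leave-one-observer-out evaluation with a neural network
# classifier. Placeholder random features stand in for the 16 EDA features
# described in the paper; the MLP size and preprocessing are assumptions.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_observers, n_videos, n_features = 20, 80, 16        # counts reported in the abstract
X = rng.normal(size=(n_observers * n_videos, n_features))   # placeholder EDA feature vectors
y = rng.integers(0, 7, size=n_observers * n_videos)         # 7 emotion categories
groups = np.repeat(np.arange(n_observers), n_videos)        # one group per observer

accuracies = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    # Fit preprocessing and classifier only on the training observers,
    # then test on the single held-out observer.
    scaler = StandardScaler().fit(X[train_idx])
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    clf.fit(scaler.transform(X[train_idx]), y[train_idx])
    accuracies.append(clf.score(scaler.transform(X[test_idx]), y[test_idx]))

print(f"mean leave-one-observer-out accuracy: {np.mean(accuracies):.3f}")
```

With real EDA features in place of the random placeholders, the mean accuracy over the 20 held-out observers is the figure comparable to the 94.8% reported for the seven-class problem.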