Affect recognition using simplistic 2D skeletal features from the upper body movement
Saba Baloch, Syed A. R. Syed Abu Bakar, M. Mokji, Saima Waseem, Adel Hafeezallah
Proceedings of the 2022 5th Artificial Intelligence and Cloud Computing Conference, published 2022-12-17
DOI: https://doi.org/10.1145/3582099.3582115
Abstract
Over the past two decades, affective computing has garnered considerable attention. However, affective computing using the body modality is still in its early stages. Body affect detection using 3D skeletal data or motion capture data has made some progress and produced promising results, but comparable advances using RGB videos have yet to be achieved. In this paper, 2D skeletal data is extracted from RGB videos using OpenPose. Joint location and joint angle features computed on the MPIIEmo and GEMEP datasets are used to efficiently recognize the affective states of anger, happiness, sadness, and surprise.
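To illustrate the kind of features the abstract describes, below is a minimal sketch (not the authors' implementation) of computing upper-body joint-location and joint-angle features from 2D keypoints. It assumes OpenPose-style output where each keypoint is an (x, y, confidence) triple in BODY_25 ordering (e.g., 1 = neck, 2/3/4 = right shoulder/elbow/wrist, 5/6/7 = left shoulder/elbow/wrist); the normalisation choices are illustrative assumptions, not taken from the paper.

```python
# Sketch of upper-body 2D skeletal features, assuming OpenPose BODY_25 keypoints.
import numpy as np

def joint_angle(a, b, c):
    """Angle (radians) at joint b formed by points a-b-c in 2D."""
    v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def upper_body_features(keypoints):
    """keypoints: array of shape (25, 3) with (x, y, confidence) per joint."""
    kp = np.asarray(keypoints, dtype=float)[:, :2]        # drop confidence column
    neck = kp[1]
    scale = np.linalg.norm(kp[2] - kp[5]) + 1e-8           # shoulder width (assumed normaliser)
    # Joint-location features: upper-body joints relative to the neck, scale-normalised.
    upper_ids = [0, 1, 2, 3, 4, 5, 6, 7]                    # head, neck, both arms
    locations = ((kp[upper_ids] - neck) / scale).ravel()
    # Joint-angle features: elbow and shoulder angles on both sides.
    angles = np.array([
        joint_angle(kp[2], kp[3], kp[4]),                   # right elbow
        joint_angle(kp[5], kp[6], kp[7]),                   # left elbow
        joint_angle(kp[1], kp[2], kp[3]),                   # right shoulder
        joint_angle(kp[1], kp[5], kp[6]),                   # left shoulder
    ])
    return np.concatenate([locations, angles])

if __name__ == "__main__":
    frame = np.random.rand(25, 3)                           # dummy frame for shape check
    print(upper_body_features(frame).shape)                 # (20,) = 16 locations + 4 angles
```

In a full pipeline, such per-frame feature vectors would be aggregated over a video clip and fed to a classifier to predict the affective state; the specific classifier and aggregation scheme used in the paper are not shown here.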