Human Robot Interaction: Identifying Resembling Emotions Using Dynamic Body Gestures of Robot

Sara Ali, Faisal Mehmood, Fahad Iqbal Khawaja, Y. Ayaz, Muhammad Sajid, M. B. Sial, Muhammad Faiq Malik, Kashif Javed

2023 3rd International Conference on Artificial Intelligence (ICAI), published 2023-02-22. DOI: 10.1109/ICAI58407.2023.10136649
Natural social interaction between a human and a robot requires the robot to perceive complex social behaviors and, in turn, to display the learned emotional behavior during interaction. Emotions expressed through body language are one of the key parameters in Human-Robot Interaction (HRI). This research extends the concept of affective non-verbal human-robot interaction by using full-body gestures to convey closely resembling emotions in the absence of intricate facial expressions. Movements of the head, torso, legs, and arms are used to link the body movements and gestures of a bipedal humanoid robot to 28 closely resembling emotions. Variations in the robot's speed, frequency, and joint angles are the key features used to express a specific emotion. The research uses Russell's circumplex model to define 8 primary emotional categories, which are further subdivided into groups of closely associated, mutually resembling emotions. 33 participants took part in experiments to validate the emotions conveyed through the robot's body gestures. Each participant performed 28 trials in which emotions were displayed in random order and the participant had to recognize the robot's state. The results showed an average recognition accuracy of 79.69%. The study confirms that perceived emotions depend strongly on body movements, postures, and the selected features. The design model can therefore be used to convey even closely resembling emotions.
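To make the described parameterization concrete, the sketch below maps a point on Russell's circumplex (valence and arousal, each in [-1, 1]) to the three gesture features the abstract names: speed, frequency, and joint-angle amplitude. This is a minimal illustrative heuristic, not the authors' published mapping; the category labels, parameter names, and numeric ranges are all assumptions added for illustration.

import math
from dataclasses import dataclass

@dataclass
class GestureParams:
    speed: float          # normalized joint speed, 0..1 (assumed scale)
    frequency: float      # gesture repetitions per second (assumed scale)
    amplitude_deg: float  # peak joint-angle excursion in degrees (assumed)

def circumplex_to_gesture(valence: float, arousal: float) -> GestureParams:
    # Illustrative heuristic: higher arousal drives faster, more frequent
    # motion; more positive valence widens joint excursions (open postures).
    speed = 0.5 + 0.5 * arousal                       # calm -> slow, excited -> fast
    frequency = 0.2 + 0.8 * max(arousal, 0.0)         # only aroused states repeat quickly
    amplitude = 20.0 + 25.0 * (1.0 + valence) / 2.0   # closed vs. open posture
    return GestureParams(speed, frequency, amplitude)

# Eight octants of the circumplex with classic Russell-style labels; the
# paper's own 8 primary categories are not listed in the abstract, so these
# labels are placeholders.
PRIMARY_CATEGORIES = ["pleased", "excited", "aroused", "distressed",
                      "miserable", "depressed", "sleepy", "contented"]

for i, label in enumerate(PRIMARY_CATEGORIES):
    theta = 2 * math.pi * i / len(PRIMARY_CATEGORIES)  # octant center angle
    valence, arousal = math.cos(theta), math.sin(theta)
    print(f"{label:>10}: {circumplex_to_gesture(valence, arousal)}")

Under this kind of mapping, the 28 closely resembling emotions would correspond to nearby points within an octant, so neighboring emotions produce similar but not identical speed, frequency, and joint-angle settings, which is consistent with the recognition confusions one would expect between resembling emotions.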