Title: Design and testing of a hybrid expressive face for a humanoid robot
Authors: D. Bazo, R. Vaidyanathan, A. Lenz, C. Melhuish
Published in: 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems
Publication date: 2010-12-03
DOI: 10.1109/IROS.2010.5651469 (https://doi.org/10.1109/IROS.2010.5651469)
Citations: 32
Abstract
The BERT2 social robot, a platform for the exploration of human-robot interaction, is currently being built at the Bristol Robotics Laboratory. This paper describes work on the robot's face, a hybrid face composed of a plastic faceplate and an LCD display, and our implementation of facial expressions on this versatile platform. We report the implementation of two representations of affect space, each of which maps the space of potential emotions to specific facial feature parameters, and the results of a series of human-robot interaction experiments characterizing the recognizability of the robot's archetypal facial expressions. Subjects' recognition of the implemented facial expressions for happy, surprised, and sad was robust (nearly 100% recognition). Subjects, however, tended to confuse the expressions for disgusted and afraid with other expressions, with correct recognition rates of 21.1% and 52.6% respectively. Future work involves the addition of more realistic eye movements for stronger recognition of certain responses. These results demonstrate that a hybrid face with affect-space facial expression implementations can provide emotive conveyance readily recognized by human beings.
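The paper's core idea of an affect-space representation, i.e. mapping a point in a continuous emotion space to concrete facial feature parameters, can be sketched as follows. This is a hypothetical illustration, not the paper's actual implementation: the parameter names, ranges, and the linear valence/arousal mapping are all assumptions for the sake of the example.

```python
# Hypothetical sketch of an affect-space mapping: a point in a 2-D
# valence/arousal space is mapped to facial feature parameters.
# Parameter names, ranges, and the linear mapping are illustrative
# assumptions, not taken from the BERT2 paper.
from dataclasses import dataclass


@dataclass
class FaceParams:
    mouth_curve: float  # -1.0 (frown) .. 1.0 (smile)
    eye_open: float     #  0.0 (closed) .. 1.0 (wide open)
    brow_raise: float   # -1.0 (furrowed) .. 1.0 (raised)


def affect_to_face(valence: float, arousal: float) -> FaceParams:
    """Map a (valence, arousal) point, each in [-1, 1], to face parameters."""
    clamp = lambda x: max(-1.0, min(1.0, x))
    valence, arousal = clamp(valence), clamp(arousal)
    return FaceParams(
        mouth_curve=valence,            # positive affect curves the mouth up
        eye_open=0.5 + 0.5 * arousal,   # higher arousal widens the eyes
        brow_raise=clamp(0.5 * arousal - 0.5 * min(valence, 0.0)),
    )


# Archetypal expressions as affect-space points (illustrative values):
happy = affect_to_face(0.9, 0.4)
sad = affect_to_face(-0.8, -0.5)
surprised = affect_to_face(0.2, 0.9)
```

A continuous mapping like this lets the face interpolate smoothly between archetypal expressions rather than switching between a fixed set of poses, which is the practical appeal of an affect-space representation over a lookup table of expressions.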