Emotion recognition by combining prosody and sentiment analysis for expressing reactive emotion by humanoid robot
Yuanchao Li, C. Ishi, Nigel G. Ward, K. Inoue, Shizuka Nakamura, K. Takanashi, Tatsuya Kawahara
2017 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), December 2017
DOI: 10.1109/APSIPA.2017.8282243 (https://doi.org/10.1109/APSIPA.2017.8282243)
Citations: 15
Abstract
In order to achieve rapport in human-robot interaction, it is important to express a reactive emotion that matches the user's mental state. This paper presents an emotion recognition method that combines prosody and sentiment analysis so that the system can properly express reactive emotion. In the user emotion recognition module, valence estimation from prosodic features is combined with sentiment analysis of text information. Combining the two information sources significantly improved the valence estimation accuracy. In the reactive emotion expression module, the system's emotion category and level are predicted using the parameters estimated in the recognition module, based on distributions inferred from human-human dialog data. Subjective evaluation results show that the proposed method is effective for expressing human-like reactive emotion.
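To make the fusion idea in the abstract concrete, the following is a minimal sketch only, not the authors' implementation: the function names (`estimate_valence`, `select_reactive_emotion`), the linear prosody model, the late-fusion weight, and the valence-to-emotion thresholds are all illustrative assumptions standing in for the models and distributions the paper actually derives from human-human dialog data.

```python
import numpy as np


def estimate_valence(prosodic_features: np.ndarray,
                     text_sentiment: float,
                     prosody_weight: float = 0.5) -> float:
    """Fuse a prosody-based valence estimate with a text sentiment score.

    prosodic_features : utterance-level prosodic statistics (e.g. F0 mean/range,
                        energy, speaking rate), assumed normalized to [0, 1].
    text_sentiment    : sentiment polarity of the transcript in [-1, 1].
    Returns a fused valence score in [-1, 1].
    """
    # Placeholder prosody model: a fixed uniform projection standing in for
    # whatever regressor would be trained on labeled speech data.
    weights = np.full(prosodic_features.shape, 1.0 / prosodic_features.size)
    prosodic_valence = 2.0 * float(weights @ prosodic_features) - 1.0  # map to [-1, 1]

    # Late fusion: weighted average of the two valence estimates.
    return prosody_weight * prosodic_valence + (1.0 - prosody_weight) * text_sentiment


def select_reactive_emotion(valence: float) -> tuple[str, int]:
    """Map fused valence to an illustrative (category, level) pair.

    The paper bases this mapping on distributions from human-human dialog;
    the categories and thresholds below are toy values for demonstration.
    """
    if valence > 0.3:
        category = "joy"
    elif valence < -0.3:
        category = "sympathy"
    else:
        category = "neutral"
    level = min(3, 1 + int(abs(valence) * 3))  # intensity level 1-3
    return category, level


if __name__ == "__main__":
    features = np.array([0.7, 0.6, 0.8, 0.5])  # toy normalized prosodic features
    sentiment = 0.4                            # toy transcript sentiment score
    fused = estimate_valence(features, sentiment)
    print(select_reactive_emotion(fused))
```

The late-fusion weight here is a simple hand-set parameter; in practice such a weight would be tuned on validation data, which is one plausible way the two information sources could be combined.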