{"title":"Personality Traits Estimation of Participants Based on Multimodal Information in Knowledge-Transfer-type Discussion","authors":"Tessai Hayama, S. Yokoyama","doi":"10.1109/IIAIAAI55812.2022.00014","DOIUrl":null,"url":null,"abstract":"Although an evaluation index of participants’ communication skills in group dialog would be useful for improving each participant’s ability to interact in a group, conventional methods of assessing personality traits are required lots of labor- and time-consuming for participants when conducted frequently, so automation of these methods is preferred. In this study, we developed a method to estimate personality traits of each participant based on multimodal dialog information in knowledge-transfer-type discussions. To achieve it, we created a corpus of knowledge-transfer-type dialogs including participants’ multimodal information and personality assessments of the BigFive and Locus of Control and constructed statistical models to classify high and low degree of the personality traits based on participants’ multimodal information. The evaluation results showed that the model was able to estimate degree of each factor of the BigFive with accuracies in the range of 0.87-1.00, and to estimate degree of Locus of Control with accuracies in the range of 0.87-1.00. The most useful features to estimate personality traits were combinations of acoustic features, head movement features, turn-taking features, and linguistic features. It was also found that the personality traits of participants could be estimated with high accuracy even when using data from the first 5 minutes of the discussion session.","PeriodicalId":156230,"journal":{"name":"2022 12th International Congress on Advanced Applied Informatics (IIAI-AAI)","volume":"26 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 12th International Congress on Advanced Applied Informatics (IIAI-AAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IIAIAAI55812.2022.00014","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Although an evaluation index of participants’ communication skills in group dialog would be useful for improving each participant’s ability to interact in a group, conventional methods of assessing personality traits require considerable labor and time from participants when conducted frequently, so automating these assessments is preferable. In this study, we developed a method for estimating the personality traits of each participant based on multimodal dialog information in knowledge-transfer-type discussions. To this end, we created a corpus of knowledge-transfer-type dialogs that includes participants’ multimodal information together with Big Five and Locus of Control personality assessments, and constructed statistical models that classify each personality trait as high or low based on the participants’ multimodal information. The evaluation results showed that the models estimated the degree of each Big Five factor with accuracies in the range of 0.87-1.00, and the degree of Locus of Control with accuracies in the range of 0.87-1.00. The most useful features for estimating personality traits were combinations of acoustic features, head-movement features, turn-taking features, and linguistic features. We also found that participants’ personality traits could be estimated with high accuracy even when using only data from the first 5 minutes of the discussion session.
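The abstract does not specify the classifier or feature set in detail, but the overall pipeline (per-participant multimodal feature vectors, a high/low split of each trait, and a statistical classification model evaluated by accuracy) can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the placeholder feature dimensions, the median-split labeling, and the choice of a linear SVM are not taken from the paper.

```python
# Minimal sketch (not the authors' implementation): binary high/low
# classification of one personality factor from concatenated multimodal
# features, assuming per-participant feature vectors are already extracted.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: 30 participants x 40 features (acoustic + head-movement
# + turn-taking + linguistic), plus raw questionnaire scores for one factor.
X = rng.normal(size=(30, 40))        # hypothetical multimodal feature vectors
scores = rng.normal(size=30)         # hypothetical Big Five factor scores

# One plausible way to binarize a trait into high/low classes: median split.
y = (scores >= np.median(scores)).astype(int)

# Simple statistical model: standardized features + linear SVM,
# evaluated with cross-validated classification accuracy.
model = make_pipeline(StandardScaler(), SVC(kernel="linear"))
acc = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"Mean CV accuracy: {acc.mean():.2f}")
```

In practice, the same script would be run once per trait (each of the five Big Five factors and Locus of Control), and restricting the feature extraction window to the first 5 minutes of the session corresponds to the reduced-data condition reported in the abstract.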