Title: Affective evaluation of a mobile multimodal dialogue system using brain signals
Authors: M. Perakakis, A. Potamianos
Venue: 2012 IEEE Spoken Language Technology Workshop (SLT)
Publication date: 2012-12-01
DOI: 10.1109/SLT.2012.6424195
Citations: 15
Abstract
We propose the use of affective metrics such as excitement, frustration and engagement for the evaluation of multimodal dialogue systems. The affective metrics are elicited from electroencephalography (EEG) signals using the Emotiv EPOC neuroheadset. They are used in conjunction with traditional evaluation metrics (turn duration, input modality) to investigate the effect of speech recognition errors and modality usage patterns in a multimodal (touch and speech) dialogue form-filling application for the iPhone mobile device. Results show that: (1) engagement is higher for touch input, while excitement and frustration are higher for speech input, and (2) speech recognition errors and the associated repairs correspond to specific dynamic patterns of excitement and frustration. Using such physiological channels, and interpreting them carefully, is a challenging but potentially rewarding direction towards the emotional and cognitive assessment of multimodal interaction design.
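To make the evaluation idea concrete, the sketch below aggregates per-turn affective scores by input modality, mirroring finding (1). The data values, record layout, and helper function are invented for illustration; they are not taken from the paper, and real Emotiv EPOC readings would require the device's own SDK.

```python
# Hypothetical sketch: average EEG-derived affective scores per input
# modality. All numbers below are made-up placeholders, not paper data.
from statistics import mean

# Each record: (input modality, engagement, excitement, frustration),
# one per dialogue turn; scores in [0, 1] are an assumed convention.
turns = [
    ("touch",  0.72, 0.35, 0.20),
    ("touch",  0.68, 0.30, 0.25),
    ("speech", 0.55, 0.60, 0.50),
    ("speech", 0.50, 0.65, 0.55),
]

def mean_by_modality(turns, index):
    """Average the metric at position `index` separately for each modality."""
    out = {}
    for modality in {t[0] for t in turns}:
        out[modality] = mean(t[index] for t in turns if t[0] == modality)
    return out

engagement = mean_by_modality(turns, 1)   # finding (1): higher for touch
excitement = mean_by_modality(turns, 2)   # higher for speech
frustration = mean_by_modality(turns, 3)  # higher for speech
```

With these placeholder values, the per-modality means reproduce the qualitative pattern reported in the abstract: touch turns score higher on engagement, speech turns higher on excitement and frustration.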