{"title":"Effects of Proactive Dialogue Strategies on Human-Computer Trust","authors":"Matthias Kraus, Nicolas Wagner, W. Minker","doi":"10.1145/3340631.3394840","DOIUrl":null,"url":null,"abstract":"Intelligent computer systems aim at providing user-assistance for challenging tasks, like decision-making, planning, or learning. For offering optimal assistance, it is essential for such systems to decide when to be reactive or proactive and how active system behaviour should be designed. Especially, as this decision may greatly influence the user's trust in the system. Therefore, we conducted a mixed-factorial study which examines how different levels of proactivity (none, notification, suggestion, and intervention) as well as timing strategies (fixed-timing and insecurity-based) are trusted by subjects while performing a planning task. The results showed, that proactive system behaviour is perceived trustworthy in insecure situations independent of the timing. However, proactive dialogue showed strong effects on cognition-based trust (system's perceived competence and reliability) depending on task difficulty. Furthermore, fully autonomous system behaviour fails to establish an adequate human-computer trust relationship, in contrast to conservative strategies.","PeriodicalId":417607,"journal":{"name":"Proceedings of the 28th ACM Conference on User Modeling, Adaptation and Personalization","volume":"58 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"30","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 28th ACM Conference on User Modeling, Adaptation and Personalization","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3340631.3394840","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 30
Abstract
Intelligent computer systems aim to provide user assistance for challenging tasks such as decision-making, planning, or learning. To offer optimal assistance, such systems must decide when to be reactive or proactive and how proactive behaviour should be designed, especially since this decision may greatly influence the user's trust in the system. Therefore, we conducted a mixed-factorial study examining how subjects trust different levels of proactivity (none, notification, suggestion, and intervention) and different timing strategies (fixed-timing and insecurity-based) while performing a planning task. The results showed that proactive system behaviour is perceived as trustworthy in insecure situations, independent of timing. However, proactive dialogue showed strong effects on cognition-based trust (the system's perceived competence and reliability) depending on task difficulty. Furthermore, in contrast to more conservative strategies, fully autonomous system behaviour failed to establish an adequate human-computer trust relationship.
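The four proactivity levels and two timing strategies compared in the study can be illustrated with a minimal decision sketch. This is a hypothetical reconstruction for illustration only: the names `ProactivityLevel`, `should_act_proactively`, `insecurity_score`, and all parameter values are assumptions, not the authors' implementation.

```python
from enum import Enum


class ProactivityLevel(Enum):
    """The four proactivity levels compared in the study."""
    NONE = 0          # purely reactive: act only on explicit user request
    NOTIFICATION = 1  # point out that assistance is available
    SUGGESTION = 2    # propose a concrete action; the user decides
    INTERVENTION = 3  # autonomously execute the action


def should_act_proactively(strategy: str,
                           step: int,
                           insecurity_score: float,
                           fixed_interval: int = 3,
                           threshold: float = 0.7) -> bool:
    """Decide whether the system initiates dialogue at this task step.

    'fixed' triggers proactive behaviour on a fixed schedule, while
    'insecurity' triggers it when an estimate of user insecurity
    (e.g. derived from hesitation or repeated plan revisions) exceeds
    a threshold. Interval and threshold values are illustrative.
    """
    if strategy == "fixed":
        return step % fixed_interval == 0
    if strategy == "insecurity":
        return insecurity_score > threshold
    raise ValueError(f"unknown timing strategy: {strategy}")
```

Under this sketch, the paper's finding would correspond to the `"insecurity"` branch paired with `NOTIFICATION` or `SUGGESTION` being trusted most, while `INTERVENTION` (fully autonomous behaviour) undermines trust.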