Sounds good, doesn't work - an experimental study on the dynamic development of trust in digital systems under inconsistent information about developer reputation and system errors
{"title":"Sounds good, doesn't work - an experimental study on the dynamic development of trust in digital systems under inconsistent information about developer reputation and system errors","authors":"Benedikt Graf","doi":"10.1016/j.chb.2025.108736","DOIUrl":null,"url":null,"abstract":"<div><div>Developing trust in digital systems depends not only on the information about these systems, but also on the information about the system developer, as well as experience with the system. To date, little research has been conducted on how dynamic and changeable trust in digital systems is and how quickly it may be influenced, as recent models of trust development propose. Building on cognitive dissonance theory and expectation disconfirmation theory, this study explores how trust in a digital system (software agent) evolves and changes depending on a) inconsistent information about the system developer and b) the system making errors. This study includes a digital experiment, where 120 participants handled emergency calls in two simulation phases in a control center simulation where a digital system (belief-desire-intention agent) provided support and staffed vehicles with people. We used a 2 × 2 × 2 (IV1: consistent information Yes vs. No; IV2: errors T1 Yes vs. No; IV3: errors T2 Yes vs. No) design with repeated measures on trust at four measurement timepoints. The results show that inconsistent information affects the initial trust before the experience rather than the trust after the simulation phases. Errors consistently showed strong, negative effects on the development of trust across both simulation phases. This study has implications for the adoption of trust development in digital systems over time. Even if digital systems are trusted, this trust does not have to be stable and can change dynamically.</div></div>","PeriodicalId":48471,"journal":{"name":"Computers in Human Behavior","volume":"172 ","pages":"Article 108736"},"PeriodicalIF":8.9000,"publicationDate":"2025-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers in Human Behavior","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0747563225001839","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
Citations: 0
Abstract
Developing trust in digital systems depends not only on information about these systems, but also on information about the system developer, as well as on experience with the system. To date, little research has examined how dynamic and changeable trust in digital systems is, and how quickly it can be influenced, as recent models of trust development propose. Building on cognitive dissonance theory and expectation disconfirmation theory, this study explores how trust in a digital system (a software agent) evolves and changes depending on a) inconsistent information about the system developer and b) the system making errors. In a digital experiment, 120 participants handled emergency calls across two phases of a control-center simulation in which a digital system (a belief-desire-intention agent) provided support and assigned personnel to vehicles. We used a 2 × 2 × 2 design (IV1: consistent information yes vs. no; IV2: errors at T1 yes vs. no; IV3: errors at T2 yes vs. no) with repeated measures of trust at four measurement timepoints. The results show that inconsistent information affects initial trust before any experience with the system rather than trust after the simulation phases. Errors consistently showed strong, negative effects on the development of trust across both simulation phases. These findings have implications for understanding how trust in digital systems develops over time: even when a digital system is trusted, this trust is not necessarily stable and can change dynamically.
Journal overview:
Computers in Human Behavior is a scholarly journal that explores the psychological aspects of computer use. It publishes original theoretical work, research reports, literature reviews, and software and book reviews. The journal examines both the use of computers in psychology, psychiatry, and related fields, and the psychological impact of computer use on individuals, groups, and society. Articles address topics such as professional practice, training, research, human development, learning, cognition, personality, and social interactions. The journal focuses on human interaction with computers, treating the computer as a medium through which human behaviors are shaped and expressed. Professionals interested in the psychological aspects of computer use will find the journal valuable even if they have limited knowledge of computers.