{"title":"听起来不错,但行不通——在开发者声誉和系统错误信息不一致的情况下,数字系统中信任动态发展的实验研究","authors":"Benedikt Graf","doi":"10.1016/j.chb.2025.108736","DOIUrl":null,"url":null,"abstract":"<div><div>Developing trust in digital systems depends not only on the information about these systems, but also on the information about the system developer, as well as experience with the system. To date, little research has been conducted on how dynamic and changeable trust in digital systems is and how quickly it may be influenced, as recent models of trust development propose. Building on cognitive dissonance theory and expectation disconfirmation theory, this study explores how trust in a digital system (software agent) evolves and changes depending on a) inconsistent information about the system developer and b) the system making errors. This study includes a digital experiment, where 120 participants handled emergency calls in two simulation phases in a control center simulation where a digital system (belief-desire-intention agent) provided support and staffed vehicles with people. We used a 2 × 2 × 2 (IV1: consistent information Yes vs. No; IV2: errors T1 Yes vs. No; IV3: errors T2 Yes vs. No) design with repeated measures on trust at four measurement timepoints. The results show that inconsistent information affects the initial trust before the experience rather than the trust after the simulation phases. Errors consistently showed strong, negative effects on the development of trust across both simulation phases. This study has implications for the adoption of trust development in digital systems over time. Even if digital systems are trusted, this trust does not have to be stable and can change dynamically.</div></div>","PeriodicalId":48471,"journal":{"name":"Computers in Human Behavior","volume":"172 ","pages":"Article 108736"},"PeriodicalIF":8.9000,"publicationDate":"2025-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Sounds good, doesn't work - an experimental study on the dynamic development of trust in digital systems under inconsistent information about developer reputation and system errors\",\"authors\":\"Benedikt Graf\",\"doi\":\"10.1016/j.chb.2025.108736\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Developing trust in digital systems depends not only on the information about these systems, but also on the information about the system developer, as well as experience with the system. To date, little research has been conducted on how dynamic and changeable trust in digital systems is and how quickly it may be influenced, as recent models of trust development propose. Building on cognitive dissonance theory and expectation disconfirmation theory, this study explores how trust in a digital system (software agent) evolves and changes depending on a) inconsistent information about the system developer and b) the system making errors. This study includes a digital experiment, where 120 participants handled emergency calls in two simulation phases in a control center simulation where a digital system (belief-desire-intention agent) provided support and staffed vehicles with people. We used a 2 × 2 × 2 (IV1: consistent information Yes vs. No; IV2: errors T1 Yes vs. No; IV3: errors T2 Yes vs. No) design with repeated measures on trust at four measurement timepoints. 
The results show that inconsistent information affects the initial trust before the experience rather than the trust after the simulation phases. Errors consistently showed strong, negative effects on the development of trust across both simulation phases. This study has implications for the adoption of trust development in digital systems over time. Even if digital systems are trusted, this trust does not have to be stable and can change dynamically.</div></div>\",\"PeriodicalId\":48471,\"journal\":{\"name\":\"Computers in Human Behavior\",\"volume\":\"172 \",\"pages\":\"Article 108736\"},\"PeriodicalIF\":8.9000,\"publicationDate\":\"2025-06-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computers in Human Behavior\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0747563225001839\",\"RegionNum\":1,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"PSYCHOLOGY, EXPERIMENTAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers in Human Behavior","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0747563225001839","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
Citations: 0
Abstract
Developing trust in digital systems depends not only on information about the systems themselves, but also on information about the system developer and on first-hand experience with the system. To date, little research has examined how dynamic and changeable trust in digital systems is, and how quickly it can be influenced, as recent models of trust development propose. Building on cognitive dissonance theory and expectation disconfirmation theory, this study explores how trust in a digital system (a software agent) evolves and changes depending on (a) inconsistent information about the system developer and (b) errors made by the system. In a digital experiment, 120 participants handled emergency calls across two phases of a control-center simulation in which a digital system (a belief-desire-intention agent) provided support by staffing emergency vehicles with personnel. We used a 2 × 2 × 2 design (IV1: consistent information yes vs. no; IV2: errors at T1 yes vs. no; IV3: errors at T2 yes vs. no) with repeated measures of trust at four timepoints. The results show that inconsistent information affects initial trust, formed before any experience with the system, rather than trust measured after the simulation phases. Errors had consistently strong negative effects on the development of trust across both simulation phases. These findings speak to how trust in digital systems develops over time: even when a digital system is trusted, that trust need not be stable and can change dynamically.
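As a reading aid, the following is a minimal sketch of the experimental structure described above: a 2 × 2 × 2 between-subjects design crossed with four repeated trust measurements. The factor and timepoint names are illustrative assumptions, not identifiers from the paper's materials.

```python
# Minimal sketch (not the authors' code) of the 2 x 2 x 2 between-subjects
# design with four repeated trust measurements. All names are assumed
# labels chosen for illustration.
from itertools import product

FACTORS = {
    "consistent_info": (True, False),  # IV1: consistent developer information?
    "errors_t1": (True, False),        # IV2: system errors in simulation phase 1?
    "errors_t2": (True, False),        # IV3: system errors in simulation phase 2?
}
TRUST_TIMEPOINTS = ("trust_t1", "trust_t2", "trust_t3", "trust_t4")

# Enumerate the eight between-subjects cells; every participant in a cell
# provides trust ratings at all four timepoints (the repeated measure).
cells = [dict(zip(FACTORS, levels)) for levels in product(*FACTORS.values())]
for cell in cells:
    print(cell, "-> measured at:", ", ".join(TRUST_TIMEPOINTS))

print(f"{len(cells)} cells; with 120 participants, a balanced assignment "
      f"would place {120 // len(cells)} participants in each cell.")
```

Note that the 15-per-cell figure is an inference from the reported sample size; the abstract does not state the actual allocation per condition.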
About the journal:
Computers in Human Behavior is a scholarly journal that explores the psychological aspects of computer use. It covers original theoretical works, research reports, literature reviews, and software and book reviews. The journal examines both the use of computers in psychology, psychiatry, and related fields, and the psychological impact of computer use on individuals, groups, and society. Articles discuss topics such as professional practice, training, research, human development, learning, cognition, personality, and social interactions. It focuses on human interactions with computers, considering the computer as a medium through which human behaviors are shaped and expressed. Professionals interested in the psychological aspects of computer use will find this journal valuable, even with limited knowledge of computers.