Unveiling users' algorithm trust: The role of task objectivity, time pressure, and cognitive load
Ruoxin Yang, Sisheng Li, Yawei Qi, Jiali Liu, Qinghua He, Haichao Zhao
Computers in Human Behavior Reports, Volume 18, Article 100667 (published 2025-04-03)
DOI: 10.1016/j.chbr.2025.100667
Abstract
In the context of artificial intelligence and algorithm-assisted decision-making, understanding the dynamics of users' trust in algorithms and its influencing factors is crucial for optimizing human-computer interaction. This study combines questionnaires and behavioral experiments to explore the effects of task objectivity, time pressure, and cognitive load on algorithm trust. Study 1 recruited 488 participants to evaluate the perceived objectivity of 24 algorithm-estimated tasks and revealed a significant positive correlation between perceived task objectivity and algorithm trust. Using a weight estimation task selected in Study 1, identified as high in objectivity and showing comparable trust in human versus algorithmic decisions, Study 2 investigated the algorithm trust of 51 college students under different cognitive loads and time pressures. Results showed that individuals' trust in algorithms increased under time pressure in objective practical choices but significantly decreased in subsequent subjective ratings. Cognitive load, however, did not significantly affect individuals' advice adoption. Study 3 (n = 54) further validated the crucial role of time pressure in shaping algorithm trust and its dynamic changes over time. The study underscores the importance of task objectivity and time pressure for users' algorithm trust, unveiling the cognitive mechanisms through which task characteristics and contextual factors shape users' algorithm trust, and provides a theoretical foundation for developers to optimize algorithm interaction designs.