{"title":"将风险纳入人类-自动化信任的重要性","authors":"Rachel E. Stuck, Brianna J. Tomlinson, B. Walker","doi":"10.1080/1463922X.2021.1975170","DOIUrl":null,"url":null,"abstract":"Abstract A key psychological component of interactions in both human-human and human-automation relationships is trust. Although trust has repeatedly been conceptualized as having a component of risk, the role risk plays, as well as what elements of risk impact trust (e.g., perceived risk, risk-taking propensity), has not been clearly explained. Upon reviewing the foundational theories of trust, it is clear that trust is only needed when risk exists or is perceived to exist, in both human-human and human-automation contexts. Within the limited research that has explored human-automation trust and risk, it has been found that the presence of risk and a participant’s perceived situational risk impacts their behavioural trust of the automation. In addition, perceived relational risk has a strong negative relationship with trust. We provide an enhanced model of trust to demonstrate how risk interacts with trust, incorporating these distinct perceived risks, as well as risk-taking propensity. This model identifies the unique interactions of these components with trust based on both the theory reviewed and the studies that have explored some aspects of these relationships. Guidelines are provided for improving the study of human-automation trust via the incorporation of risk.","PeriodicalId":22852,"journal":{"name":"Theoretical Issues in Ergonomics Science","volume":"23 1","pages":"500 - 516"},"PeriodicalIF":1.4000,"publicationDate":"2021-10-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"The importance of incorporating risk into human-automation trust\",\"authors\":\"Rachel E. Stuck, Brianna J. Tomlinson, B. Walker\",\"doi\":\"10.1080/1463922X.2021.1975170\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract A key psychological component of interactions in both human-human and human-automation relationships is trust. Although trust has repeatedly been conceptualized as having a component of risk, the role risk plays, as well as what elements of risk impact trust (e.g., perceived risk, risk-taking propensity), has not been clearly explained. Upon reviewing the foundational theories of trust, it is clear that trust is only needed when risk exists or is perceived to exist, in both human-human and human-automation contexts. Within the limited research that has explored human-automation trust and risk, it has been found that the presence of risk and a participant’s perceived situational risk impacts their behavioural trust of the automation. In addition, perceived relational risk has a strong negative relationship with trust. We provide an enhanced model of trust to demonstrate how risk interacts with trust, incorporating these distinct perceived risks, as well as risk-taking propensity. This model identifies the unique interactions of these components with trust based on both the theory reviewed and the studies that have explored some aspects of these relationships. 
Guidelines are provided for improving the study of human-automation trust via the incorporation of risk.\",\"PeriodicalId\":22852,\"journal\":{\"name\":\"Theoretical Issues in Ergonomics Science\",\"volume\":\"23 1\",\"pages\":\"500 - 516\"},\"PeriodicalIF\":1.4000,\"publicationDate\":\"2021-10-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Theoretical Issues in Ergonomics Science\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1080/1463922X.2021.1975170\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"ERGONOMICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Theoretical Issues in Ergonomics Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/1463922X.2021.1975170","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ERGONOMICS","Score":null,"Total":0}
The importance of incorporating risk into human-automation trust
Abstract A key psychological component of interactions in both human-human and human-automation relationships is trust. Although trust has repeatedly been conceptualized as having a component of risk, the role risk plays, as well as which elements of risk impact trust (e.g., perceived risk, risk-taking propensity), has not been clearly explained. A review of the foundational theories of trust makes clear that trust is only needed when risk exists or is perceived to exist, in both human-human and human-automation contexts. The limited research that has explored human-automation trust and risk has found that the presence of risk and a participant's perceived situational risk impact their behavioural trust in the automation. In addition, perceived relational risk has a strong negative relationship with trust. We provide an enhanced model of trust that demonstrates how risk interacts with trust, incorporating these distinct perceived risks as well as risk-taking propensity. The model identifies the unique interactions of these components with trust, based on both the theory reviewed and the studies that have explored aspects of these relationships. Guidelines are provided for improving the study of human-automation trust via the incorporation of risk.