Offended by the algorithm: The hidden interpersonal costs of clients seeking AI second opinion

IF 8.9 · CAS Region 1 (Psychology) · JCR Q1 (Psychology, Experimental)
Computers in Human Behavior · Pub Date: 2026-06-01 · Epub Date: 2026-02-03 · DOI: 10.1016/j.chb.2026.108934
Gerri Spassova, Mauricio Palmeira
Volume 179, Article 108934 · https://www.sciencedirect.com/science/article/pii/S0747563226000312
Citations: 0

Abstract

Rapid advances in artificial intelligence have enabled the rise of AI-enabled advisory tools. While these tools benefit decision-makers, they also introduce new competitive pressures for human advisors whose expertise they may complement or replace. Prior research shows that advisors react negatively when clients approach other advisors, feeling offended and becoming less willing to maintain the relationship. Yet little is known about how advisors respond when the other advisor is an AI system rather than a human. Across four studies, we examine how professionals perceive and react to clients who consult AI-enabled (vs. other human) advisors. We find that learning a client has also sought AI (vs. other human) advice decreases focal advisors' motivation to work with that client. This effect persists even when clients use AI only for background information or as a complementary resource. We propose that advisors view AI as substantially inferior to themselves; thus, being placed in the same category as an AI system feels insulting and signals disrespect, undermining advisors' willingness to engage. We also show that consulting AI may change perceptions of clients, making them appear less competent and less warm. Our work contributes to emerging research on the advisor perspective and extends the literature on human responses to AI by shifting attention from AI users to service providers. Practically, the findings suggest that clients' seemingly innocuous use of AI tools may unintentionally erode their relationships with human advisors.


Journal metrics
CiteScore: 19.10 · Self-citation rate: 4.00% · Articles per year: 381 · Review time: 40 days
Journal overview: Computers in Human Behavior is a scholarly journal that explores the psychological aspects of computer use. It publishes original theoretical work, research reports, literature reviews, and software and book reviews. The journal examines both the use of computers in psychology, psychiatry, and related fields, and the psychological impact of computer use on individuals, groups, and society. Articles cover topics such as professional practice, training, research, human development, learning, cognition, personality, and social interaction. Its focus is on human interaction with computers, treating the computer as a medium through which human behaviors are shaped and expressed. Professionals interested in the psychological aspects of computer use will find the journal valuable, even with limited knowledge of computers.