IF 1.5 · CAS Tier 3 (Psychology) · JCR Q4 (Physiology)
Bertrand Beffara, Marina Veyrie, Laura Mauduit, Lara Bardi, Irene Cristofori
DOI: 10.1177/17470218251326569
Journal: Quarterly Journal of Experimental Psychology
Publication date: 2025-03-01 (Journal Article)
Citations: 0

Abstract

No evidence for the efficiency of the eye-tracking-based Reading the Mind in the Eyes Test version at detecting differences of mind reading abilities across psychological traits.

The 'Reading the Mind in the Eyes Test' (RMET) is one of the most used tests of theory of mind. Its principle is to match an emotion word to the corresponding face image. The performance at this test has been associated with multiple psychological variables, including personality, loneliness and empathy. Recently, however, the validity of the RMET has been questioned. An alternative version of the test has been tested using eye-tracking in addition to manual responses and was hypothesized to be more sensitive. Here, we put this hypothesis to the test by attempting to reproduce already-assessed correlational results between the performance at the classical RMET and the self-reported personality, loneliness and empathy, now using eye-gaze as an RMET performance index. Despite a marked eye-gaze bias towards the face image corresponding to the target word, the eye-gaze pattern correlated with none of the self-reported psychological variables. This result highlights the interest in using eye-tracking for theory of mind tests, while questioning the robustness of the association between psychological variables and RMET performance, and the validity of the RMET itself.
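The abstract describes using eye-gaze, rather than manual responses, as the RMET performance index, and correlating that index with self-report scores across participants. As a minimal illustrative sketch (not the authors' actual pipeline; the dwell-time proportion index, the AOI names, and the toy data below are all assumptions), such an analysis could look like:

```python
# Hypothetical sketch: use the proportion of dwell time on the target face
# image as a per-trial gaze index, then rank-correlate the per-participant
# index with a self-report questionnaire score.

def dwell_proportion(dwell_ms_by_aoi, target_aoi):
    """Fraction of total fixation time spent on the target face AOI."""
    total = sum(dwell_ms_by_aoi.values())
    return dwell_ms_by_aoi[target_aoi] / total if total else 0.0

def spearman_rho(xs, ys):
    """Spearman rank correlation (no tie correction; for illustration only)."""
    def ranks(vs):
        order = sorted(range(len(vs)), key=lambda i: vs[i])
        r = [0.0] * len(vs)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Toy data: one gaze index per participant vs. a questionnaire score.
gaze_index = [dwell_proportion({"target": 900, "foil": 300}, "target"),
              dwell_proportion({"target": 600, "foil": 600}, "target"),
              dwell_proportion({"target": 450, "foil": 750}, "target")]
empathy_score = [42, 35, 28]
print(round(spearman_rho(gaze_index, empathy_score), 2))  # → 1.0
```

In this toy example the gaze index tracks the questionnaire score perfectly; the study's point is that, with real data, no such correlation was observed for any of the self-report measures.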

Source journal metrics:
CiteScore: 3.50
Self-citation rate: 5.90%
Articles per year: 178
Review time: 3-8 weeks
Journal description: Promoting the interests of scientific psychology and its researchers, QJEP, the journal of the Experimental Psychology Society, is a leading journal with a long-standing tradition of publishing cutting-edge research. Several articles have become classic papers in the fields of attention, perception, learning, memory, language, and reasoning. The journal publishes original articles on any topic within the field of experimental psychology (including comparative research). These include substantial experimental reports, review papers, rapid communications (reporting novel techniques or ground-breaking results), comments (on articles previously published in QJEP or on issues of general interest to experimental psychologists), and book reviews. Experimental results are welcomed from all relevant techniques, including behavioural testing, brain imaging and computational modelling.

QJEP offers a competitive publication time-scale. Accepted Rapid Communications have priority in the publication cycle and usually appear in print within three months. We aim to publish all accepted (but uncorrected) articles online within seven days. Our Latest Articles page offers immediate publication of articles upon reaching their final form.

The journal offers an open access option called Open Select, enabling authors to meet funder requirements to make their article free to read online for all in perpetuity. Authors also benefit from a broad and diverse subscription base that delivers the journal contents to a world-wide readership. Together these features ensure that the journal offers authors the opportunity to raise the visibility of their work to a global audience.