Arousal level and exemplar variability of emotional face and voice encoding influence expression-independent identity recognition

Impact Factor: 1.7 · JCR Q3 (Psychology, Experimental) · CAS Tier 3 (Psychology)
Hanjian Xu, Jorge L. Armony
Journal: Motivation and Emotion
DOI: 10.1007/s11031-024-10066-1 (https://doi.org/10.1007/s11031-024-10066-1)
Published: 2024-04-09 (Journal Article)
Citations: 0

Abstract

Emotional stimuli and events are better and more easily remembered than neutral ones. However, this advantage appears to come at a cost, namely a decreased accuracy for peripheral, emotion-irrelevant details. There is some evidence, particularly in the visual modality, that this trade-off also applies to emotional expressions, leading to a difficulty in identifying an unfamiliar individual’s identity when presented with an expression different from the one encountered at encoding. On the other hand, past research also suggests that identity recognition memory benefits from exposure to different encoding exemplars, although whether this is also the case for emotional expressions, particularly voices, remains unknown. Here, we directly addressed these questions by conducting a series of voice and face identity memory online studies, using a within-subject old/new recognition test in separate unimodal modules. In the Main Study, half of the identities were encoded with four presentations of one single expression (angry, fearful, happy, or sad; Uni condition) and the other half with one presentation of each emotion (Multi condition); all identities, intermixed with an equal number of new ones, were presented with a neutral expression in a subsequent recognition test. Participants (N = 547, 481 female) were randomly assigned to one of four groups in which a different Uni single emotion was used. Results, using linear mixed models on response choice and drift-diffusion-model parameters, revealed that high-arousal expressions interfered with emotion-independent identity recognition accuracy, but that such deficit could be compensated by presenting the same individual with various expressions (i.e., high exemplar variability). 
These findings were confirmed by a significant correlation between memory performance and stimulus arousal, across modalities and emotions, and by two follow-up studies (Study 1: N = 172, 150 female; Study 2: N = 174, 154 female), which extended the original observations and ruled out some potential confounding effects. Taken together, the findings reported here expand and refine our current knowledge of the influence of emotion on memory, and highlight the importance of, and interaction between, exemplar variability and emotional arousal in identity recognition memory.
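The abstract reports analyses of drift-diffusion-model (DDM) parameters, in which an old/new decision is modeled as noisy evidence accumulating toward one of two response boundaries. As a purely illustrative sketch — not the authors' analysis pipeline, and with all parameter values chosen arbitrarily — a single DDM trial can be simulated as follows:

```python
import numpy as np

def simulate_ddm(drift, boundary, start=0.0, noise=1.0,
                 dt=0.001, max_t=5.0, rng=None):
    """Simulate one drift-diffusion trial: evidence accumulates with
    mean rate `drift` (plus Gaussian noise) until it reaches +boundary
    ("old" response) or -boundary ("new" response).
    Returns (choice, reaction_time). Illustrative parameters only."""
    if rng is None:
        rng = np.random.default_rng()
    x, t = start, 0.0
    while abs(x) < boundary and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return ("old" if x >= boundary else "new"), t

rng = np.random.default_rng(0)
# A higher drift rate corresponds to stronger memory evidence, giving
# faster and more accurate "old" responses to studied identities; in the
# study's framing, arousal and exemplar-variability effects would show up
# as changes in fitted parameters such as the drift rate.
trials = [simulate_ddm(drift=1.5, boundary=1.0, rng=rng) for _ in range(500)]
accuracy = sum(choice == "old" for choice, _ in trials) / len(trials)
```

In practice, DDM parameters are fitted to observed choices and reaction times rather than simulated forward; the sketch only shows the generative process the model assumes.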


Source journal: Motivation and Emotion
CiteScore: 5.40
Self-citation rate: 4.20%
Articles published per year: 69
Journal description: Motivation and Emotion publishes articles on human motivational and emotional phenomena that make theoretical advances by linking empirical findings to underlying processes. Submissions should focus on key problems in motivation and emotion and, if using non-human participants, should contribute to theories concerning human behavior. Articles should be explanatory rather than merely descriptive, providing the data necessary to understand the origins of motivation and emotion, to explicate why, how, and under what conditions motivational and emotional states change, and to document that these processes are important to human functioning. A range of methodological approaches is welcome, with methodological rigor as the key criterion. Manuscripts that rely exclusively on self-report data are appropriate, but published articles tend to be those that rely on objective measures (e.g., behavioral observations, psychophysiological responses, reaction times, brain activity, and performance or achievement indicators), either singly or in combination with self-report data. The journal generally does not publish scale development and validation articles; however, it is open to articles that focus on the post-validation contribution that a new measure can make. Scale development and validation work may therefore be submitted if it is used as a necessary prerequisite to follow-up studies that demonstrate the importance of the new scale in making a theoretical advance.