Updating expectancies about audiovisual associations in speech

Tim Paris, Jeesun Kim, C. Davis
Seeing and Perceiving, 120(1): 164–164, 2012-01-01. DOI: 10.1163/187847612X647946

Abstract

The processing of multisensory information depends on the learned association between sensory cues. In the case of speech there is a well-learned association between the movements of the lips and the subsequent sound: particular lip and mouth movements reliably lead to a specific sound. EEG and MEG studies that have investigated the differences between this ‘congruent’ AV association and other ‘incongruent’ associations have commonly reported ERP differences from 350 ms after sound onset. Using a 256-active-electrode EEG system, we tested whether this ‘congruency effect’ would be reduced in a context where most of the trials had an altered audiovisual association (auditory speech paired with mismatched visual lip movements). Participants were presented with stimuli over two sessions: in one session only 15% of trials were incongruent; in the other session, 85% were incongruent. We found a congruency effect, with ERP differences between congruent and incongruent speech between 350 and 500 ms. Importantly, this effect was reduced within the context of mostly incongruent trials. This reduction in the congruency effect indicates that the way in which AV speech is processed depends on the context in which it is viewed. Furthermore, this result suggests that exposure to novel sensory relationships leads to updated expectations regarding the relationship between auditory and visual speech cues.
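As an illustration of the design described above (not code from the paper), the following Python sketch builds the two session trial lists with the stated 15% and 85% incongruent-trial proportions, and computes a simple congruency-effect measure as the mean ERP amplitude difference (incongruent minus congruent) in the 350–500 ms post-onset window. The trial counts, sampling rate, and toy waveforms are hypothetical assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_session(n_trials, p_incongruent):
    """Return a shuffled trial list with the given proportion of
    incongruent (mismatched AV) trials."""
    n_inc = round(n_trials * p_incongruent)
    trials = ["incongruent"] * n_inc + ["congruent"] * (n_trials - n_inc)
    rng.shuffle(trials)
    return trials

# Two sessions: 15% vs. 85% incongruent (proportions from the abstract;
# n_trials = 200 is a hypothetical choice).
session_low = make_session(200, 0.15)
session_high = make_session(200, 0.85)

def congruency_effect(erp_cong, erp_incong, times, t0=0.350, t1=0.500):
    """Mean amplitude difference (incongruent - congruent) in the
    350-500 ms post-sound-onset window reported in the abstract."""
    window = (times >= t0) & (times <= t1)
    return erp_incong[window].mean() - erp_cong[window].mean()

# Toy ERPs sampled at 1 kHz over 0-700 ms (illustrative, not real data).
times = np.arange(0.0, 0.7, 0.001)
erp_c = np.zeros_like(times)
erp_i = np.where((times >= 0.35) & (times <= 0.5), 1.0, 0.0)
print(congruency_effect(erp_c, erp_i, times))  # prints 1.0
```

In the study's logic, this per-condition difference would be computed separately for the mostly-congruent and mostly-incongruent sessions; a smaller value in the latter corresponds to the reported reduction of the congruency effect.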