Weaker McGurk Effect for Rubin's Vase-Type Speech in People With High Autistic Traits

IF 1.8 · Zone 4 (Psychology) · JCR Q3 (Biophysics)
Yuta Ujiie, Kohske Takahashi
{"title":"高度自闭症患者的鲁宾花瓶型语音麦克格克效应较弱","authors":"Yuta Ujiie, Kohske Takahashi","doi":"10.1163/22134808-bja10047","DOIUrl":null,"url":null,"abstract":"<p><p>While visual information from facial speech modulates auditory speech perception, it is less influential on audiovisual speech perception among autistic individuals than among typically developed individuals. In this study, we investigated the relationship between autistic traits (Autism-Spectrum Quotient; AQ) and the influence of visual speech on the recognition of Rubin's vase-type speech stimuli with degraded facial speech information. Participants were 31 university students (13 males and 18 females; mean age: 19.2, SD: 1.13 years) who reported normal (or corrected-to-normal) hearing and vision. All participants completed three speech recognition tasks (visual, auditory, and audiovisual stimuli) and the AQ-Japanese version. The results showed that accuracies of speech recognition for visual (i.e., lip-reading) and auditory stimuli were not significantly related to participants' AQ. In contrast, audiovisual speech perception was less susceptible to facial speech perception among individuals with high rather than low autistic traits. The weaker influence of visual information on audiovisual speech perception in autism spectrum disorder (ASD) was robust regardless of the clarity of the visual information, suggesting a difficulty in the process of audiovisual integration rather than in the visual processing of facial speech.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"1-17"},"PeriodicalIF":1.8000,"publicationDate":"2021-04-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Weaker McGurk Effect for Rubin's Vase-Type Speech in People With High Autistic Traits.\",\"authors\":\"Yuta Ujiie, Kohske Takahashi\",\"doi\":\"10.1163/22134808-bja10047\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>While visual information from facial speech modulates auditory speech perception, it is less influential on audiovisual speech perception among autistic individuals than among typically developed individuals. In this study, we investigated the relationship between autistic traits (Autism-Spectrum Quotient; AQ) and the influence of visual speech on the recognition of Rubin's vase-type speech stimuli with degraded facial speech information. Participants were 31 university students (13 males and 18 females; mean age: 19.2, SD: 1.13 years) who reported normal (or corrected-to-normal) hearing and vision. All participants completed three speech recognition tasks (visual, auditory, and audiovisual stimuli) and the AQ-Japanese version. The results showed that accuracies of speech recognition for visual (i.e., lip-reading) and auditory stimuli were not significantly related to participants' AQ. In contrast, audiovisual speech perception was less susceptible to facial speech perception among individuals with high rather than low autistic traits. 
The weaker influence of visual information on audiovisual speech perception in autism spectrum disorder (ASD) was robust regardless of the clarity of the visual information, suggesting a difficulty in the process of audiovisual integration rather than in the visual processing of facial speech.</p>\",\"PeriodicalId\":51298,\"journal\":{\"name\":\"Multisensory Research\",\"volume\":\" \",\"pages\":\"1-17\"},\"PeriodicalIF\":1.8000,\"publicationDate\":\"2021-04-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Multisensory Research\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.1163/22134808-bja10047\",\"RegionNum\":4,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"BIOPHYSICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Multisensory Research","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1163/22134808-bja10047","RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"BIOPHYSICS","Score":null,"Total":0}
Citations: 0

Abstract

While visual information from facial speech modulates auditory speech perception, it is less influential on audiovisual speech perception among autistic individuals than among typically developed individuals. In this study, we investigated the relationship between autistic traits (Autism-Spectrum Quotient; AQ) and the influence of visual speech on the recognition of Rubin's vase-type speech stimuli with degraded facial speech information. Participants were 31 university students (13 males and 18 females; mean age: 19.2, SD: 1.13 years) who reported normal (or corrected-to-normal) hearing and vision. All participants completed three speech recognition tasks (visual, auditory, and audiovisual stimuli) and the AQ-Japanese version. The results showed that accuracies of speech recognition for visual (i.e., lip-reading) and auditory stimuli were not significantly related to participants' AQ. In contrast, audiovisual speech perception was less susceptible to facial speech perception among individuals with high rather than low autistic traits. The weaker influence of visual information on audiovisual speech perception in autism spectrum disorder (ASD) was robust regardless of the clarity of the visual information, suggesting a difficulty in the process of audiovisual integration rather than in the visual processing of facial speech.
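The abstract describes a correlational analysis between autistic traits (AQ) and how strongly visual (facial) speech influences audiovisual speech perception. The sketch below is a minimal, hypothetical illustration of how such a relationship could be quantified and tested; it is not the authors' analysis code, and the variable names, the "visual capture" index, and the simulated data are assumptions for demonstration only.

```python
# Illustrative sketch only: NOT the published analysis.
# It shows one plausible way to relate AQ scores to a McGurk-type
# "visual capture" index (proportion of audiovisual trials whose
# response followed the visual/fused percept rather than the
# auditory token). All data below are randomly simulated.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_participants = 31  # sample size reported in the abstract

# Hypothetical per-participant measures (simulated):
aq = rng.integers(5, 40, size=n_participants)            # AQ scores
av_visual_capture = np.clip(                              # visual-capture index in [0, 1]
    0.6 - 0.01 * aq + rng.normal(0.0, 0.1, n_participants),
    0.0, 1.0,
)

# Pearson correlation between autistic traits and visual influence.
# A negative r would mirror the abstract's finding that audiovisual
# perception is less influenced by facial speech at higher AQ.
r, p = stats.pearsonr(aq, av_visual_capture)
print(f"r = {r:.2f}, p = {p:.3f}")
```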

Source Journal
Multisensory Research (BIOPHYSICS-PSYCHOLOGY)
CiteScore: 3.50
Self-citation rate: 12.50%
Articles published: 15
Journal description: Multisensory Research is an interdisciplinary archival journal covering all aspects of multisensory processing including the control of action, cognition and attention. Research using any approach to increase our understanding of multisensory perceptual, behavioural, neural and computational mechanisms is encouraged. Empirical, neurophysiological, psychophysical, brain imaging, clinical, developmental, mathematical and computational analyses are welcome. Research will also be considered covering multisensory applications such as sensory substitution, crossmodal methods for delivering sensory information or multisensory approaches to robotics and engineering. Short communications and technical notes that draw attention to new developments will be included, as will reviews and commentaries on current issues. Special issues dealing with specific topics will be announced from time to time. Multisensory Research is a continuation of Seeing and Perceiving, and of Spatial Vision.