{"title":"Competing influence of visual speech on auditory neural adaptation","authors":"Marc Sato","doi":"10.1016/j.bandl.2023.105359","DOIUrl":null,"url":null,"abstract":"<div><p>Visual information from a speaker’s face enhances auditory neural processing and speech recognition. To determine whether auditory memory can be influenced by visual speech, the degree of auditory neural adaptation of an auditory syllable preceded by an auditory, visual, or audiovisual syllable was examined using EEG. Consistent with previous findings and additional adaptation of auditory neurons tuned to acoustic features, stronger adaptation of N1, P2 and N2 auditory evoked responses was observed when the auditory syllable was preceded by an auditory compared to a visual syllable. However, although stronger than when preceded by a visual syllable, lower adaptation was observed when the auditory syllable was preceded by an audiovisual compared to an auditory syllable. In addition, longer N1 and P2 latencies were then observed. These results further demonstrate that visual speech acts on auditory memory but suggest competing visual influences in the case of audiovisual stimulation.</p></div>","PeriodicalId":55330,"journal":{"name":"Brain and Language","volume":null,"pages":null},"PeriodicalIF":2.1000,"publicationDate":"2023-11-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Brain and Language","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0093934X23001384","RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AUDIOLOGY & SPEECH-LANGUAGE PATHOLOGY","Score":null,"Total":0}
引用次数: 0
Abstract
Visual information from a speaker’s face enhances auditory neural processing and speech recognition. To determine whether auditory memory can be influenced by visual speech, the degree of auditory neural adaptation to an auditory syllable preceded by an auditory, visual, or audiovisual syllable was examined using EEG. Consistent with previous findings, and reflecting additional adaptation of auditory neurons tuned to acoustic features, stronger adaptation of the N1, P2 and N2 auditory evoked responses was observed when the auditory syllable was preceded by an auditory rather than a visual syllable. However, adaptation was weaker when the auditory syllable was preceded by an audiovisual rather than an auditory syllable, although still stronger than when it was preceded by a visual syllable, and N1 and P2 latencies were longer in the audiovisual condition. These results further demonstrate that visual speech acts on auditory memory but suggest competing visual influences in the case of audiovisual stimulation.
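The study quantifies adaptation as the attenuation (and latency shift) of auditory evoked responses when a syllable is preceded by another stimulus. As an illustration only, the sketch below shows one way such N1/P2 adaptation measures could be computed from condition-averaged ERP waveforms in Python; the time windows, sampling rate, variable names, and synthetic data are assumptions for demonstration and are not taken from the study's analysis pipeline.

```python
# Minimal sketch (not the author's pipeline) of quantifying N1/P2 adaptation
# from condition-averaged auditory evoked responses. All windows, names, and
# data below are illustrative assumptions, not values from the study.
import numpy as np

SFREQ = 500.0                                # assumed sampling rate (Hz)
TIMES = np.arange(-0.1, 0.5, 1 / SFREQ)      # epoch from -100 ms to +500 ms

def peak_in_window(erp, times, tmin, tmax, polarity):
    """Return (latency_s, amplitude) of the peak within [tmin, tmax].

    polarity = -1 finds the most negative deflection (N1/N2-like),
    polarity = +1 the most positive one (P2-like).
    """
    mask = (times >= tmin) & (times <= tmax)
    window = erp[mask]
    idx = np.argmax(polarity * window)
    return times[mask][idx], window[idx]

def adaptation_index(test_erp, baseline_erp, times, tmin, tmax, polarity):
    """Amplitude reduction of the test response relative to a baseline
    response (e.g., the same syllable presented without a preceding prime)."""
    _, amp_test = peak_in_window(test_erp, times, tmin, tmax, polarity)
    _, amp_base = peak_in_window(baseline_erp, times, tmin, tmax, polarity)
    return amp_base - amp_test               # positive = suppressed response

# Placeholder waveforms; in practice these would be condition averages
# (auditory-, visual-, or audiovisual-primed syllables) in volts.
rng = np.random.default_rng(0)
erp_auditory_primed = rng.normal(0, 1e-6, TIMES.size)
erp_baseline = rng.normal(0, 1e-6, TIMES.size)

n1_adapt = adaptation_index(erp_auditory_primed, erp_baseline,
                            TIMES, 0.08, 0.15, polarity=-1)
p2_adapt = adaptation_index(erp_auditory_primed, erp_baseline,
                            TIMES, 0.15, 0.25, polarity=+1)
print(f"N1 adaptation: {n1_adapt:.2e} V, P2 adaptation: {p2_adapt:.2e} V")
```

Peak-based measures are only one option; mean amplitude over a fixed window is a common, more noise-robust alternative for comparing adaptation across priming conditions.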
About the journal:
An interdisciplinary journal, Brain and Language publishes articles that elucidate the complex relationships among language, brain, and behavior. The journal covers the large variety of modern techniques in cognitive neuroscience, including functional and structural brain imaging, electrophysiology, cellular and molecular neurobiology, genetics, lesion-based approaches, and computational modeling. All articles must relate to human language and be relevant to the understanding of its neurobiological and neurocognitive bases. Published articles in the journal are expected to have significant theoretical novelty and/or practical implications, and use perspectives and methods from psychology, linguistics, and neuroscience along with brain data and brain measures.