András Bálint, Wilhelm Wimmer, Marco Caversaccio, Christian Rummel, Stefan Weder
{"title":"正常听力成人的脑激活模式:一项使用适应性临床言语理解任务的fNIRS研究。","authors":"András Bálint , Wilhelm Wimmer , Marco Caversaccio , Christian Rummel , Stefan Weder","doi":"10.1016/j.heares.2024.109155","DOIUrl":null,"url":null,"abstract":"<div><h3>Objectives</h3><div>Understanding brain processing of auditory and visual speech is essential for advancing speech perception research and improving clinical interventions for individuals with hearing impairment. Functional near-infrared spectroscopy (fNIRS) is deemed to be highly suitable for measuring brain activity during language tasks. However, accurate data interpretation also requires validated stimuli and behavioral measures.</div></div><div><h3>Design</h3><div>Twenty-six adults with normal hearing listened to sentences from the Oldenburg Sentence Test (OLSA), and brain activation in the temporal, occipital, and prefrontal areas was measured by fNIRS. The sentences were presented in one of the four different modalities: speech-in-quiet, speech-in-noise, audiovisual speech or visual speech (i.e., lipreading). To support the interpretation of our fNIRS data, and to obtain a more comprehensive understanding of the study population, we performed hearing tests (pure tone and speech audiometry) and collected behavioral data using validated questionnaires, in-task comprehension questions, and listening effort ratings.</div></div><div><h3>Results</h3><div>In the auditory conditions (i.e., speech-in-quiet and speech-in-noise), we observed cortical activity in the temporal regions bilaterally. During the visual speech condition, we measured significant activation in the occipital area. Following the audiovisual condition, cortical activation was observed in both regions. Furthermore, we established a baseline for how individuals with normal hearing process visual cues during lipreading, and we found higher activity in the prefrontal cortex in noise conditions compared to quiet conditions, linked to higher listening effort.</div></div><div><h3>Conclusions</h3><div>We demonstrated the applicability of a clinically inspired audiovisual speech-comprehension task in participants with normal hearing. The measured brain activation patterns were supported and complemented by objective and behavioral parameters.</div></div>","PeriodicalId":12881,"journal":{"name":"Hearing Research","volume":"455 ","pages":"Article 109155"},"PeriodicalIF":2.5000,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Brain activation patterns in normal hearing adults: An fNIRS Study using an adapted clinical speech comprehension task\",\"authors\":\"András Bálint , Wilhelm Wimmer , Marco Caversaccio , Christian Rummel , Stefan Weder\",\"doi\":\"10.1016/j.heares.2024.109155\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><h3>Objectives</h3><div>Understanding brain processing of auditory and visual speech is essential for advancing speech perception research and improving clinical interventions for individuals with hearing impairment. Functional near-infrared spectroscopy (fNIRS) is deemed to be highly suitable for measuring brain activity during language tasks. 
However, accurate data interpretation also requires validated stimuli and behavioral measures.</div></div><div><h3>Design</h3><div>Twenty-six adults with normal hearing listened to sentences from the Oldenburg Sentence Test (OLSA), and brain activation in the temporal, occipital, and prefrontal areas was measured by fNIRS. The sentences were presented in one of the four different modalities: speech-in-quiet, speech-in-noise, audiovisual speech or visual speech (i.e., lipreading). To support the interpretation of our fNIRS data, and to obtain a more comprehensive understanding of the study population, we performed hearing tests (pure tone and speech audiometry) and collected behavioral data using validated questionnaires, in-task comprehension questions, and listening effort ratings.</div></div><div><h3>Results</h3><div>In the auditory conditions (i.e., speech-in-quiet and speech-in-noise), we observed cortical activity in the temporal regions bilaterally. During the visual speech condition, we measured significant activation in the occipital area. Following the audiovisual condition, cortical activation was observed in both regions. Furthermore, we established a baseline for how individuals with normal hearing process visual cues during lipreading, and we found higher activity in the prefrontal cortex in noise conditions compared to quiet conditions, linked to higher listening effort.</div></div><div><h3>Conclusions</h3><div>We demonstrated the applicability of a clinically inspired audiovisual speech-comprehension task in participants with normal hearing. The measured brain activation patterns were supported and complemented by objective and behavioral parameters.</div></div>\",\"PeriodicalId\":12881,\"journal\":{\"name\":\"Hearing Research\",\"volume\":\"455 \",\"pages\":\"Article 109155\"},\"PeriodicalIF\":2.5000,\"publicationDate\":\"2025-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Hearing Research\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0378595524002089\",\"RegionNum\":2,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AUDIOLOGY & SPEECH-LANGUAGE PATHOLOGY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Hearing Research","FirstCategoryId":"3","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0378595524002089","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AUDIOLOGY & SPEECH-LANGUAGE PATHOLOGY","Score":null,"Total":0}
Brain activation patterns in normal hearing adults: An fNIRS Study using an adapted clinical speech comprehension task
Objectives
Understanding brain processing of auditory and visual speech is essential for advancing speech perception research and improving clinical interventions for individuals with hearing impairment. Functional near-infrared spectroscopy (fNIRS) is well suited to measuring brain activity during language tasks. However, accurate data interpretation also requires validated stimuli and behavioral measures.
Design
Twenty-six adults with normal hearing listened to sentences from the Oldenburg Sentence Test (OLSA), and brain activation in the temporal, occipital, and prefrontal areas was measured by fNIRS. The sentences were presented in one of four modalities: speech-in-quiet, speech-in-noise, audiovisual speech, or visual speech (i.e., lipreading). To support the interpretation of our fNIRS data, and to obtain a more comprehensive understanding of the study population, we performed hearing tests (pure tone and speech audiometry) and collected behavioral data using validated questionnaires, in-task comprehension questions, and listening effort ratings.
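For readers who want a concrete picture of how condition-wise responses in such a four-condition block design could be estimated, the following Python sketch fits an ordinary-least-squares GLM with HRF-convolved boxcar regressors to a preprocessed HbO time series. This is a minimal illustration under stated assumptions, not the authors' analysis pipeline; the sampling rate, block duration, and all variable names (`hbo`, `onsets_by_condition`) are hypothetical.

```python
# Minimal sketch (assumed, not the authors' pipeline): estimating condition-wise
# HbO responses for a four-condition block design with an ordinary-least-squares GLM.
# `hbo` (channels x samples), block onsets, sampling rate, and block duration are
# hypothetical placeholders.
import numpy as np
from scipy.stats import gamma

FS = 7.8            # assumed fNIRS sampling rate in Hz
CONDITIONS = ["speech_in_quiet", "speech_in_noise", "audiovisual", "visual_only"]

def boxcar(onsets_s, duration_s, n_samples):
    """Binary block regressor: 1 during stimulation, 0 elsewhere."""
    reg = np.zeros(n_samples)
    for onset in onsets_s:
        start = int(onset * FS)
        reg[start:start + int(duration_s * FS)] = 1.0
    return reg

def double_gamma_hrf(duration_s=32.0):
    """Simple double-gamma haemodynamic response function."""
    t = np.arange(0, duration_s, 1.0 / FS)
    return gamma.pdf(t, 6.0) - (1.0 / 6.0) * gamma.pdf(t, 16.0)

def fit_glm(hbo, onsets_by_condition, block_duration_s=20.0):
    """Return one beta per condition and channel from an OLS fit."""
    n_samples = hbo.shape[1]
    kernel = double_gamma_hrf()
    regressors = [
        np.convolve(boxcar(onsets_by_condition[c], block_duration_s, n_samples),
                    kernel)[:n_samples]
        for c in CONDITIONS
    ]
    X = np.column_stack(regressors + [np.ones(n_samples)])    # add intercept column
    betas, *_ = np.linalg.lstsq(X, hbo.T, rcond=None)         # (n_regressors, n_channels)
    return dict(zip(CONDITIONS, betas[:-1]))                  # drop intercept row
```

Region-of-interest values (e.g., temporal, occipital, prefrontal) would then be obtained by averaging the returned betas over the channels covering each region.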
Results
In the auditory conditions (i.e., speech-in-quiet and speech-in-noise), we observed cortical activity in the temporal regions bilaterally. During the visual speech condition, we measured significant activation in the occipital area. In the audiovisual condition, cortical activation was observed in both regions. Furthermore, we established a baseline for how individuals with normal hearing process visual cues during lipreading, and we found higher activity in the prefrontal cortex in noise conditions compared to quiet conditions, linked to higher listening effort.
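As a purely illustrative sketch of how the noise-versus-quiet prefrontal difference and its relation to listening effort could be tested, a paired comparison plus a correlation might look as follows; the variable names and test choices are assumptions, not the study's actual statistics.

```python
# Hypothetical illustration only: paired comparison of prefrontal HbO betas between
# noise and quiet conditions, and its correlation with self-reported listening effort.
# `pfc_quiet`, `pfc_noise`, `effort` are assumed per-participant arrays.
import numpy as np
from scipy import stats

def noise_vs_quiet(pfc_quiet, pfc_noise, effort):
    t, p = stats.ttest_rel(pfc_noise, pfc_quiet)          # paired t-test across participants
    diff = np.asarray(pfc_noise) - np.asarray(pfc_quiet)
    r, p_r = stats.pearsonr(diff, effort)                 # link to listening effort ratings
    return {"t": t, "p": p, "r": r, "p_corr": p_r}
```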
Conclusions
We demonstrated the applicability of a clinically inspired audiovisual speech-comprehension task in participants with normal hearing. The measured brain activation patterns were supported and complemented by objective and behavioral parameters.
Journal description:
The aim of the journal is to provide a forum for papers concerned with basic peripheral and central auditory mechanisms. Emphasis is on experimental and clinical studies, but theoretical and methodological papers will also be considered. The journal publishes original research papers, review and mini-review articles, rapid communications, method/protocol and perspective articles.
Papers submitted should deal with auditory anatomy, physiology, psychophysics, imaging, modeling and behavioural studies in animals and humans, as well as hearing aids and cochlear implants. Papers dealing with the vestibular system are also considered for publication. Papers on comparative aspects of hearing and on effects of drugs and environmental contaminants on hearing function will also be considered. Clinical papers will be accepted when they contribute to the understanding of normal and pathological hearing functions.