Haruka Saito, Camille Clarté, Mark Tiede, Lucie Ménard
{"title":"先天性失明成人言语知觉中听觉-空气-触觉整合缺失。","authors":"Haruka Saito, Camille Clarté, Mark Tiede, Lucie Ménard","doi":"10.1121/10.0038638","DOIUrl":null,"url":null,"abstract":"<p><p>Multisensory integration plays a central role in shaping perceptual experience, but how such processes develop in the absence of vision remains unclear. Previous work has shown that sighted individuals integrate tactile air-puff cues with auditory speech signals, even in the absence of linguistic experience linking the two modalities. This study asked whether congenitally blind individuals form similar perceptual associations. Ten blind French-speaking adults completed a forced-choice identification task involving plosive consonants presented along a voice onset time continuum, with and without synchronized air puffs to the skin. Unlike sighted participants in earlier research, the blind group showed no evidence of audio-aerotactile integration. This absence of effect was not attributable to heightened auditory precision or insufficient statistical power. We interpret these findings within the framework of maximum likelihood estimation and discuss possible explanations: (1) that blind individuals flexibly down-weighted the tactile cue as uninformative in context or (2) that they never formed a strong association between the cues due to a lack of early visual experience. These results suggest that visual input may influence how cross-modal associations are formed or weighted in speech perception.</p>","PeriodicalId":17168,"journal":{"name":"Journal of the Acoustical Society of America","volume":"158 2","pages":"1052-1059"},"PeriodicalIF":2.3000,"publicationDate":"2025-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Absence of audio-aerotactile integration in speech perception among congenitally blind adults.\",\"authors\":\"Haruka Saito, Camille Clarté, Mark Tiede, Lucie Ménard\",\"doi\":\"10.1121/10.0038638\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Multisensory integration plays a central role in shaping perceptual experience, but how such processes develop in the absence of vision remains unclear. Previous work has shown that sighted individuals integrate tactile air-puff cues with auditory speech signals, even in the absence of linguistic experience linking the two modalities. This study asked whether congenitally blind individuals form similar perceptual associations. Ten blind French-speaking adults completed a forced-choice identification task involving plosive consonants presented along a voice onset time continuum, with and without synchronized air puffs to the skin. Unlike sighted participants in earlier research, the blind group showed no evidence of audio-aerotactile integration. This absence of effect was not attributable to heightened auditory precision or insufficient statistical power. We interpret these findings within the framework of maximum likelihood estimation and discuss possible explanations: (1) that blind individuals flexibly down-weighted the tactile cue as uninformative in context or (2) that they never formed a strong association between the cues due to a lack of early visual experience. 
These results suggest that visual input may influence how cross-modal associations are formed or weighted in speech perception.</p>\",\"PeriodicalId\":17168,\"journal\":{\"name\":\"Journal of the Acoustical Society of America\",\"volume\":\"158 2\",\"pages\":\"1052-1059\"},\"PeriodicalIF\":2.3000,\"publicationDate\":\"2025-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of the Acoustical Society of America\",\"FirstCategoryId\":\"101\",\"ListUrlMain\":\"https://doi.org/10.1121/10.0038638\",\"RegionNum\":2,\"RegionCategory\":\"物理与天体物理\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ACOUSTICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of the Acoustical Society of America","FirstCategoryId":"101","ListUrlMain":"https://doi.org/10.1121/10.0038638","RegionNum":2,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ACOUSTICS","Score":null,"Total":0}
Absence of audio-aerotactile integration in speech perception among congenitally blind adults.
Multisensory integration plays a central role in shaping perceptual experience, but how such processes develop in the absence of vision remains unclear. Previous work has shown that sighted individuals integrate tactile air-puff cues with auditory speech signals, even in the absence of linguistic experience linking the two modalities. This study asked whether congenitally blind individuals form similar perceptual associations. Ten blind French-speaking adults completed a forced-choice identification task involving plosive consonants presented along a voice onset time continuum, with and without synchronized air puffs to the skin. Unlike sighted participants in earlier research, the blind group showed no evidence of audio-aerotactile integration. This absence of effect was not attributable to heightened auditory precision or insufficient statistical power. We interpret these findings within the framework of maximum likelihood estimation and discuss possible explanations: (1) that blind individuals flexibly down-weighted the tactile cue as uninformative in context or (2) that they never formed a strong association between the cues due to a lack of early visual experience. These results suggest that visual input may influence how cross-modal associations are formed or weighted in speech perception.
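For readers unfamiliar with the maximum likelihood estimation framework invoked above, the standard formulation (given here as an illustrative sketch; the symbols and notation are not taken from the paper itself) is reliability-weighted cue combination: an auditory estimate $\hat{s}_A$ and an aerotactile estimate $\hat{s}_T$ are averaged with weights inversely proportional to their variances,

$$\hat{s} = w_A \hat{s}_A + w_T \hat{s}_T, \qquad w_A = \frac{1/\sigma_A^2}{1/\sigma_A^2 + 1/\sigma_T^2}, \qquad w_T = \frac{1/\sigma_T^2}{1/\sigma_A^2 + 1/\sigma_T^2}.$$

Under this scheme, a cue treated as uninformative in context ($\sigma_T^2 \to \infty$) receives a weight approaching zero, which is the sense in which the tactile cue could be "down-weighted" under explanation (1), whereas explanation (2) amounts to the cues never being bound into a common estimate in the first place.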
Journal description:
Since 1929, The Journal of the Acoustical Society of America has been the leading source of theoretical and experimental research results in the broad interdisciplinary study of sound. Subject coverage includes linear and nonlinear acoustics; aeroacoustics, underwater sound, and acoustical oceanography; ultrasonics and quantum acoustics; architectural and structural acoustics and vibration; speech, music, and noise; psychology and physiology of hearing; engineering acoustics and transduction; and bioacoustics, including animal bioacoustics.