Audio-Visual Interference During Motion Discrimination in Starlings
Gesa Feenders, Georg M. Klump
Multisensory Research, Vol. 36, No. 2, pp. 181-212
Published: 17 January 2023
DOI: 10.1163/22134808-bja10092
Citations: 0
Abstract
Motion discrimination is essential for animals to avoid collisions, escape from predators, catch prey, or communicate. Although most terrestrial vertebrates can benefit from combining concurrent auditory and visual stimuli to obtain a more salient percept of a moving object, there is little research on the mechanisms involved in such cross-modal motion discrimination. We used European starlings as a model because their visual and auditory systems are well studied. In a behavioural motion discrimination task with visual and acoustic stimuli, we investigated the effects of cross-modal interference and attentional processes. Our results showed an impairment of motion discrimination when the visual and acoustic stimuli moved in opposite directions compared to when they moved in a congruent direction. By presenting an acoustic stimulus of very short duration, and thus lacking directional motion information, we revealed an additional alerting effect of the acoustic stimulus. Finally, we show that a temporally leading acoustic stimulus did not improve response behaviour compared to synchronous presentation of the stimuli, as would have been expected if alerting effects played a major role. This further supports the importance of congruency and synchronicity in the current test paradigm, with only a minor role for attentional processes elicited by the acoustic stimulus. Together, our data clearly show cross-modal interference effects in an audio-visual motion discrimination paradigm when real-life stimuli are carefully selected under parameter conditions that meet the known criteria for cross-modal binding.
Journal Description
Multisensory Research is an interdisciplinary archival journal covering all aspects of multisensory processing, including the control of action, cognition, and attention. Research using any approach to increase our understanding of multisensory perceptual, behavioural, neural, and computational mechanisms is encouraged. Empirical, neurophysiological, psychophysical, brain imaging, clinical, developmental, mathematical, and computational analyses are welcome. Research covering multisensory applications, such as sensory substitution, crossmodal methods for delivering sensory information, or multisensory approaches to robotics and engineering, will also be considered. Short communications and technical notes that draw attention to new developments will be included, as will reviews and commentaries on current issues. Special issues dealing with specific topics will be announced from time to time. Multisensory Research is a continuation of Seeing and Perceiving and of Spatial Vision.