{"title":"Audiovisual Simultaneity Judgements in Synaesthesia.","authors":"Anna Borgolte, Ahmad Bransi, Johanna Seifert, Sermin Toto, Gregor R Szycik, Christopher Sinke","doi":"10.1163/22134808-bja10050","DOIUrl":"10.1163/22134808-bja10050","url":null,"abstract":"<p><p>Synaesthesia is a multimodal phenomenon in which the activation of one sensory modality leads to an involuntary additional experience in another sensory modality. To date, normal multisensory processing has hardly been investigated in synaesthetes. In the present study we examine processes of audiovisual separation in synaesthesia by using a simultaneity judgement task. Subjects were asked to indicate whether an acoustic and a visual stimulus occurred simultaneously or not. Stimulus onset asynchronies (SOAs) as well as the temporal order of the stimuli were systematically varied. Our results demonstrate that synaesthetes are better at separating auditory and visual events than control subjects, but only when vision leads.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"38909098","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Effect of Motion Direction and Eccentricity on Vection, VR Sickness and Head Movements in Virtual Reality.","authors":"Katharina Margareta Theresa Pöhlmann, Julia Föcker, Patrick Dickinson, Adrian Parke, Louise O'Hare","doi":"10.1163/22134808-bja10049","DOIUrl":"10.1163/22134808-bja10049","url":null,"abstract":"<p><p>Virtual Reality (VR) experienced through head-mounted displays often leads to vection, discomfort and sway in the user. This study investigated the effect of motion direction and eccentricity on these three phenomena using optic flow patterns displayed using the Valve Index. Visual motion stimuli were presented in the centre, periphery or far periphery and moved either in depth (back and forth) or laterally (left and right). Overall, vection was stronger for motion in depth than for lateral motion. Additionally, eccentricity primarily affected stimuli moving in depth, with stronger vection for more peripherally presented motion patterns than for more central ones. Motion direction affected the various aspects of VR sickness differently and modulated the effect of eccentricity on VR sickness. For stimuli moving in depth, far-peripheral presentation caused more discomfort, whereas for lateral motion, central stimuli caused more discomfort. Stimuli moving in depth led to more head movements in the anterior-posterior direction when the entire visual field was stimulated. Observers demonstrated more head movements in the anterior-posterior direction than in the medio-lateral direction throughout the entire experiment, independent of the motion direction or eccentricity of the presented moving stimulus. Head movements were elicited on the same plane as the moving stimulus only for stimuli moving in depth covering the entire visual field. Correlations showed positive relationships between dizziness and vection duration, and between general discomfort and sway. Identifying where in the visual field presented motion causes the least VR sickness without compromising vection and presence can guide the development of Virtual Reality games, training and treatment programmes.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"38895506","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Effects of Audiovisual Presentations on Visual Localization Errors: One or Several Multisensory Mechanisms?","authors":"Cristina Jordão Nazaré, Armando Mónica Oliveira","doi":"10.1163/22134808-bja10048","DOIUrl":"10.1163/22134808-bja10048","url":null,"abstract":"<p><p>The present study examines the extent to which temporal and spatial properties of sound modulate visual motion processing in spatial localization tasks. Participants were asked to locate the place at which a moving visual target unexpectedly vanished. Across different tasks, accompanying sounds were factorially varied within subjects as to their onset and offset times and/or positions relative to visual motion. Sound onset had no effect on the localization error. Sound offset was shown to modulate the perceived visual offset location, both for temporal and spatial disparities. This modulation did not conform to attraction toward the timing or location of the sounds but, demonstrably in the case of temporal disparities, to bimodal enhancement instead. Favorable indications of a contextual effect of audiovisual presentations on interspersed visual-only trials were also found. The short sound-leading offset asynchrony had benefits equivalent to audiovisual offset synchrony, suggestive of the involvement of early-level mechanisms, constrained by a temporal window, under these conditions. Yet, we tentatively hypothesize that the results as a whole, and how they compare with previous studies, require the contribution of additional mechanisms, including learned detection of auditory-visual associations and cross-sensory spread of endogenous attention.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-04-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"38895507","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Weaker McGurk Effect for Rubin's Vase-Type Speech in People With High Autistic Traits.","authors":"Yuta Ujiie, Kohske Takahashi","doi":"10.1163/22134808-bja10047","DOIUrl":"10.1163/22134808-bja10047","url":null,"abstract":"<p><p>While visual information from facial speech modulates auditory speech perception, it is less influential on audiovisual speech perception among autistic individuals than among typically developed individuals. In this study, we investigated the relationship between autistic traits (Autism-Spectrum Quotient; AQ) and the influence of visual speech on the recognition of Rubin's vase-type speech stimuli with degraded facial speech information. Participants were 31 university students (13 males and 18 females; mean age: 19.2, SD: 1.13 years) who reported normal (or corrected-to-normal) hearing and vision. All participants completed three speech recognition tasks (visual, auditory, and audiovisual stimuli) and the AQ-Japanese version. The results showed that accuracies of speech recognition for visual (i.e., lip-reading) and auditory stimuli were not significantly related to participants' AQ. In contrast, audiovisual speech perception was less susceptible to facial speech perception among individuals with high rather than low autistic traits. The weaker influence of visual information on audiovisual speech perception in autism spectrum disorder (ASD) was robust regardless of the clarity of the visual information, suggesting a difficulty in the process of audiovisual integration rather than in the visual processing of facial speech.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-04-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"38888435","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Redundant Signals Effect and the Full Body Illusion: not Multisensory, but Unisensory Tactile Stimuli Are Affected by the Illusion.","authors":"Lieke M J Swinkels, Harm Veling, Hein T van Schie","doi":"10.1163/22134808-bja10046","DOIUrl":"10.1163/22134808-bja10046","url":null,"abstract":"<p><p>During a full body illusion (FBI), participants experience a change in self-location towards a body that they see in front of them from a third-person perspective and experience touch to originate from this body. Multisensory integration is thought to underlie this illusion. In the present study we tested the redundant signals effect (RSE) as a new objective measure of the illusion, designed to directly tap into the multisensory integration underlying it. The illusion was induced by an experimenter who stroked and tapped the participant's shoulder and underarm, while participants perceived the touch on the virtual body in front of them via a head-mounted display. Participants performed a speeded detection task, responding to visual stimuli on the virtual body, to tactile stimuli on the real body and to combined (multisensory) visual and tactile stimuli. Analysis of the RSE with a race model inequality test indicated that multisensory integration took place in both the synchronous and the asynchronous condition. This surprising finding suggests that simultaneous bodily stimuli from different (visual and tactile) modalities are transiently integrated into a multisensory representation even when no illusion is induced. Furthermore, it suggests that the RSE is not a suitable objective measure of body illusions. Interestingly, however, responses to the unisensory tactile stimuli in the speeded detection task were slower and had a larger variance in the asynchronous condition than in the synchronous condition. The implications of this finding for the literature on body representations are discussed.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-04-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"25577661","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multiple Spatial Coordinates Influence the Prediction of Tactile Events Facilitated by Approaching Visual Stimuli.","authors":"Tsukasa Kimura","doi":"10.1163/22134808-bja10045","DOIUrl":"10.1163/22134808-bja10045","url":null,"abstract":"<p><p>Interaction with other sensory information is important for the prediction of tactile events. Recent studies have reported that the approach of visual information toward the body facilitates the prediction of subsequent tactile events. However, the processing of tactile events is influenced by multiple spatial coordinates, and it remains unclear how this approach effect influences tactile events in different spatial coordinates, i.e., spatial reference frames. We investigated the relationship between the prediction of a tactile stimulus via this approach effect and spatial coordinates by comparing ERPs. Participants were asked to place their arms on a desk and to respond to tactile stimuli, which were presented to the left (or right) index finger with high probability (80%) or to the opposite index finger with low probability (20%). Before the presentation of each tactile stimulus, visual stimuli approached sequentially toward the hand to which the high-probability tactile stimulus was presented. In the uncrossed condition, each hand was placed on the corresponding side. In the crossed condition, the hands were crossed and placed on the opposite side, i.e., the left (right) hand placed on the right (left) side. Thus, the spatial location of the tactile stimulus and the hand was consistent in the uncrossed condition and inconsistent in the crossed condition. The results showed that N1 amplitudes elicited by high-probability tactile stimuli decreased only in the uncrossed condition. These results suggest that the prediction of a tactile stimulus facilitated by approaching visual information is influenced by multiple spatial coordinates.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-03-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"25484869","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Introduction to the Special Issue on Multisensory Perception in Philosophy.","authors":"Amber Ross, Mohan Matthen","doi":"10.1163/22134808-034001ED","DOIUrl":"https://doi.org/10.1163/22134808-034001ED","url":null,"abstract":"European philosophers of the modern period generally acknowledged that the senses are our primary source of knowledge about the contingent states of the world around us. The question of modality was of secondary interest and was very little discussed in this period. Why? Because these philosophers were atomists about sense-perception, an attitude that makes multisensory perception impossible. Let us explain. Atomists hold that all sense-experience is of ‘ideas’ — a somewhat oversimple, but still useful, way to think of these is as images. All ideas are ultimately composed of simple ideas. Atomists hold, moreover, that the intrinsic nature of a simple (or non-composite) idea is fully given by conscious experience of that idea, and in no other way. For example, burnt sienna is a simple idea because it is not composed of other ideas. Nothing about its intrinsic nature can be known except by experiencing it — a colour-blind individual cannot know what it is. It is, moreover, adequately and completely known when it is experienced; there is nothing more to know about it than is given by visual experience of it (see Note 1). Now, on this account of simple ideas, distinctions among them cannot be analysed. For atomists, inter-modal distinctions, like all other distinctions among ideas, are primitive and based in experience. What, for example, is the difference between burnt sienna and the sound of a trumpet playing middle C? All that can be said is that they are experientially different from one","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-03-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"25465185","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Relationship Between Multisensory Temporal Processing and Schizotypal Traits.","authors":"Tyler C Dalal, Anne-Marie Muller, Ryan A Stevenson","doi":"10.1163/22134808-bja10044","DOIUrl":"10.1163/22134808-bja10044","url":null,"abstract":"<p><p>Recent literature has suggested that deficits in sensory processing are associated with schizophrenia (SCZ), and more specifically with hallucination severity. The DSM-5's shift towards a dimensional approach to diagnostic criteria has led to SCZ and schizotypal personality disorder (SPD) being classified as schizophrenia spectrum disorders. With SCZ and SPD overlapping in aetiology and symptomatology, such as sensory abnormalities, it is important to investigate whether these deficits commonly reported in SCZ extend to non-clinical expressions of SPD. In this study, we investigated whether levels of SPD traits were related to audiovisual multisensory temporal processing in a non-clinical sample, revealing two novel findings. First, less precise multisensory temporal processing was related to higher overall levels of SPD symptomatology. Second, this relationship was specific to the cognitive-perceptual domain of SPD symptomatology, and more specifically, the Unusual Perceptual Experiences and Odd Beliefs or Magical Thinking symptomatology. The current study provides an initial look at the relationship between multisensory temporal processing and schizotypal traits. Additionally, it builds on the previous literature by suggesting that less precise multisensory temporal processing is not exclusive to SCZ but may also be related to non-clinical expressions of schizotypal traits in the general population.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-02-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"25465186","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Embodiment and Multisensory Perception of Synchronicity: Biological Features Modulate Visual and Tactile Multisensory Interaction in Simultaneity Judgements.","authors":"Ramiro Joly-Mascheroni, Sonia Abad-Hernando, Bettina Forster, Beatriz Calvo-Merino","doi":"10.1163/22134808-bja10020","DOIUrl":"10.1163/22134808-bja10020","url":null,"abstract":"<p><p>The concept of embodiment has been used in multiple scenarios, but in cognitive neuroscience it normally refers to the comprehension of the role of one's own body in the cognition of everyday situations and the processes involved in that perception. Multisensory research is gradually embracing the concept of embodiment, but the focus has mostly been concentrated upon audiovisual integration. In two experiments, we evaluated how the likelihood that a perceived stimulus is embodied modulates visuotactile interaction in a Simultaneity Judgement task. Experiment 1 compared the perception of two visual stimuli with and without biological attributes (hands and geometrical shapes) moving towards each other, while tactile stimuli were delivered to the palm of the participants' hand. Participants judged whether the meeting point of the two periodically-moving visual stimuli was synchronous with the tactile stimulation on their own hands. Results showed that in the hand condition, the Point of Subjective Simultaneity (PSS) was significantly more distant from real synchrony (60 ms after the Stimulus Onset Asynchrony, SOA) than in the geometrical shape condition (45 ms after SOA). In Experiment 2, we further explored the impact of biological attributes by comparing performance on two visual biological stimuli (hands and ears) that also vary in their motor and visuotactile properties. Results showed that the PSS was equally distant from real synchrony in the hands and ears conditions. Overall, the findings suggest that embodied visual biological stimuli may modulate visual and tactile multisensory interaction in simultaneity judgements.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"25327426","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Visual Self-Motion Feedback Affects the Sense of Self in Virtual Reality.","authors":"Aubrieann Schettler, Ian Holstead, John Turri, Michael Barnett-Cowan","doi":"10.1163/22134808-bja10043","DOIUrl":"10.1163/22134808-bja10043","url":null,"abstract":"<p><p>We assessed how self-motion affects the visual representation of the self. We constructed a novel virtual-reality experiment that systematically varied an avatar's motion and also biological sex. Participants were presented with pairs of avatars that visually represented the participant ('self-avatar'), or another person ('opposite avatar'). Avatar motion either corresponded with the participant's motion, or was decoupled from the participant's motion. The results show that participants identified with (i) 'self-avatars' over 'opposite-avatars', (ii) avatars moving congruently with self-motion over incongruent motion, and importantly (iii) with the 'opposite avatar' over the 'self-avatar' when the opposite avatar's motion was congruent with self-motion. Our results suggest that both self-motion and biological sex are relevant to the body schema and body image and that congruent bottom-up visual feedback of self-motion is particularly important for the sense of self and capable of overriding top-down self-identification factors such as biological sex.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2020-12-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"25327427","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}