{"title":"Relating Sound and Sight in Simulated Environments.","authors":"Kevin Y Tsang, Damien J Mannion","doi":"10.1163/22134808-bja10082","DOIUrl":"https://doi.org/10.1163/22134808-bja10082","url":null,"abstract":"<p><p>The auditory signals at the ear can be affected by components arriving both directly from a sound source and indirectly via environmental reverberation. Previous studies have suggested that the perceptual separation of these contributions can be aided by expectations of likely reverberant qualities. Here, we investigated whether vision can provide information about the auditory properties of physical locations that could also be used to develop such expectations. We presented participants with audiovisual stimuli derived from 10 simulated real-world locations via a head-mounted display (HMD; n = 44) or a web-based ( n = 60) delivery method. On each trial, participants viewed a first-person perspective rendering of a location before hearing a spoken utterance that was convolved with an impulse response that was from a location that was either the same as (congruent) or different to (incongruent) the visually-depicted location. We find that audiovisual congruence was associated with an increase in the probability of participants reporting an audiovisual match of about 0.22 (95% credible interval: [ 0.17 , 0.27 ]), and that participants were more likely to confuse audiovisual pairs as matching if their locations had similar reverberation times. Overall, this study suggests that human perceivers have a capacity to form expectations of reverberation from visual information. Such expectations may be useful for the perceptual challenge of separating sound sources and reverberation from within the signal available at the ear.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"35 7-8","pages":"589-622"},"PeriodicalIF":1.6,"publicationDate":"2022-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9251263","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Something in the Sway: Effects of the Shepard-Risset Glissando on Postural Activity and Vection.","authors":"Rebecca A Mursic, Stephen Palmisano","doi":"10.1163/22134808-bja10081","DOIUrl":"https://doi.org/10.1163/22134808-bja10081","url":null,"abstract":"<p><p>This study investigated claims of disrupted equilibrium when listening to the Shepard-Risset glissando (which creates an auditory illusion of perpetually ascending/descending pitch). During each trial, 23 participants stood quietly on a force plate for 90 s with their eyes either open or closed (30 s pre-sound, 30 s of sound and 30 s post-sound). Their centre of foot pressure (CoP) was continuously recorded during the trial and a verbal measure of illusory self-motion (i.e., vection) was obtained directly afterwards. As expected, vection was stronger during Shepard-Risset glissandi than during white noise or phase-scrambled auditory control stimuli. Individual differences in auditorily evoked postural sway (observed during sound) were also found to predict the strength of this vection. Importantly, the patterns of sway induced by Shepard-Risset glissandi differed significantly from those during our auditory control stimuli - but only in terms of their temporal dynamics. Since significant sound type differences were not seen in terms of sway magnitude, this stresses the importance of investigating the temporal dynamics of sound-posture interactions.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"35 7-8","pages":"555-587"},"PeriodicalIF":1.6,"publicationDate":"2022-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10704636","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Odor-Induced Taste Enhancement Is Specific to Naturally Occurring Temporal Order and the Respiration Phase.","authors":"Shogo Amano, Takuji Narumi, Tatsu Kobayakawa, Masayoshi Kobayashi, Masahiko Tamura, Yuko Kusakabe, Yuji Wada","doi":"10.1163/22134808-bja10080","DOIUrl":"https://doi.org/10.1163/22134808-bja10080","url":null,"abstract":"<p><p>Interaction between odor and taste information creates flavor perception. There are many possible determinants of the interaction between odor and taste, one of which may be the somatic sensations associated with breathing. We assumed that a smell stimulus accompanied by inhaling or exhaling enhances taste intensity if the order is congruent with natural drinking. To present an olfactory stimulus from the identical location during inhalation and exhalation, we blocked the gap between the tube presenting the olfactory stimulus and the nostril. Participants breathed and ingested the solution according to the instructions on the screen and evaluated the solution's taste intensity. Vanilla odor enhanced the sweet taste in both retronasal and orthonasal conditions when the order of stimuli was congruent with natural drinking, but it did not do so in either condition when they were incongruent. The results suggest that breathing is a determinant of odor-taste interaction. The methods of presenting olfactory stimuli used in this study were compared and discussed in relation to those used in previous studies. Odor-induced taste enhancement depends on the time order of smell with breathing and taste congruency in natural drinking. Taste enhancement was induced by odor in both conditions by minimizing differences in odor presentation between them.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"35 7-8","pages":"537-554"},"PeriodicalIF":1.6,"publicationDate":"2022-08-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10698815","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Exploring Group Differences in the Crossmodal Correspondences.","authors":"Charles Spence","doi":"10.1163/22134808-bja10079","DOIUrl":"https://doi.org/10.1163/22134808-bja10079","url":null,"abstract":"<p><p>There has been a rapid growth of interest amongst researchers in the cross-modal correspondences in recent years. In part, this has resulted from the emerging realization of the important role that the correspondences can sometimes play in multisensory integration. In turn, this has led to an interest in the nature of any differences between individuals, or rather, between groups of individuals, in the strength and/or consensuality of cross-modal correspondences that may be observed in both neurotypically normal groups cross-culturally, developmentally, and across various special populations (including those who have lost a sense, as well as those with autistic tendencies). The hope is that our emerging understanding of such group differences may one day provide grounds for supporting the reality of the various different types of correspondence that have so far been proposed, namely structural, statistical, semantic, and hedonic (or emotionally mediated).</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"35 6","pages":"495-536"},"PeriodicalIF":1.6,"publicationDate":"2022-08-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"40427965","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Size and Quality of Drawings Made by Adults Under Visual and Haptic Control.","authors":"Magdalena Szubielska, Paweł Augustynowicz, Delphine Picard","doi":"10.1163/22134808-bja10078","DOIUrl":"https://doi.org/10.1163/22134808-bja10078","url":null,"abstract":"<p><p>The aim of this study was twofold. First, our objective was to test the influence of an object's actual size (size rank) on the drawn size of the depicted object. We tested the canonical size effect (i.e., drawing objects larger in the physical world as larger) in four drawing conditions - two perceptual conditions (blindfolded or sighted) crossed with two materials (paper or special foil for producing embossed drawings). Second, we investigated whether drawing quality (we analysed both the local and global criteria of quality) depends on drawing conditions. We predicted that drawing quality, unlike drawing size, would vary according to drawing conditions - namely, being higher when foil than paper was used for drawing production in the blindfolded condition. We tested these hypotheses with young adults who repeatedly drew eight different familiar objects (differentiated by size in the real world) in four drawing conditions. As expected, drawn size increased linearly with increasing size rank, whatever the drawing condition, thus replicating the canonical size effect and showing that this effect was not dependent on drawing conditions. In line with our hypothesis, in the blindfolded condition drawing quality was better when foil rather than paper was used, suggesting a benefit from haptic feedback on the trace produced. Besides, the quality of drawings produced was still higher in the sighted than the blindfolded condition. In conclusion, canonical size is present under different drawing conditions regardless of whether sight is involved or not, while perceptual control increases drawing quality in adults.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"35 6","pages":"471-493"},"PeriodicalIF":1.6,"publicationDate":"2022-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"40623794","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Crossmodal Correspondence between Music and Ambient Color Is Mediated by Emotion.","authors":"Pia Hauck, Christoph von Castell, Heiko Hecht","doi":"10.1163/22134808-bja10077","DOIUrl":"https://doi.org/10.1163/22134808-bja10077","url":null,"abstract":"<p><p>The quality of a concert hall primarily depends on its acoustics. But does visual input also have an impact on musical enjoyment? Does the color of ambient lighting modulate the perceived music quality? And are certain colors perceived to fit better than others with a given music piece? To address these questions, we performed three within-subjects experiments. We carried out two pretests to select four music pieces differing in tonality and genre, and 14 lighting conditions of varying hue, brightness, and saturation. In the main experiment, we applied a fully crossed repeated-measures design. Under each of the four lighting conditions, participants rated the musical variables 'Harmonic', 'Powerful', 'Gloomy', 'Lively' and overall liking of the music pieces, as well as the perceived fit of music and lighting. Subsequently, participants evaluated music and lighting separately by rating the same variables as before, as well as their emotional impact (valence, arousal, dominance). We found that music and lighting being similarly rated in terms of valence and arousal in the unimodal conditions were judged to match better when presented together. Accordingly, tonal (atonal) music was rated to fit better with weakly saturated (highly saturated) colors. Moreover, some characteristics of the lighting were carried over to music. That is, just as red lighting was rated as more powerful than green and blue lighting, music was evaluated to be more powerful under red compared to green and blue lighting. We conclude that listening to music is a multisensory process enriched by impressions from the visual domain.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"35 5","pages":"407-446"},"PeriodicalIF":1.6,"publicationDate":"2022-06-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"40623795","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Investigating the Crossmodal Influence of Odour on the Visual Perception of Facial Attractiveness and Age.","authors":"Yi-Chuan Chen, Charles Spence","doi":"10.1163/22134808-bja10076","DOIUrl":"https://doi.org/10.1163/22134808-bja10076","url":null,"abstract":"<p><p>We report two experiments designed to investigate whether the presentation of a range of pleasant fragrances, containing both floral and fruity notes, would modulate people's judgements of the facial attractiveness (Experiment 1) and age (Experiment 2) of a selection of typical female faces varying in age in the range 20-69 years. In Experiment 1, male participants rated the female faces as less attractive when presented with an unpleasant fragrance compared to clean air. The rated attractiveness of the female faces was lower when the participants rated the unpleasant odour as having a lower attractiveness and pleasantness, and a higher intensity. In Experiment 2, both male and female participants rated the age of female faces while presented with one of four pleasant fragrances or clean air as a control. Only the female participants demonstrated a crossmodal effect, with the pleasant fragrances inducing an older rating for female faces in the 40-49-years-old age range, whereas a younger rating was documented for female faces in the 60-69-years-old age range. Taken together, these results are consistent with the view that while the valence of fragrance (pleasant versus unpleasant) exerts a robust crossmodal influence over judgements of facial attractiveness, the effects of pleasant fragrance on judgements of a person's age appear to be less reliable. One possible explanation for the differing effect of scent in the two cases relates to the fact that attractiveness judgements are more subjective, hedonic, and/or intuitive than age ratings which are more objective, cognitive-mediated, and/or analytic in nature.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"35 6","pages":"447-469"},"PeriodicalIF":1.6,"publicationDate":"2022-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"40623796","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Do Congruent Auditory Stimuli Facilitate Visual Search in Dynamic Environments? An Experimental Study Based on Multisensory Interaction.","authors":"Xiaofang Sun, Pin-Hsuan Chen, Pei-Luen Patrick Rau","doi":"10.1163/22134808-bja10075","DOIUrl":"10.1163/22134808-bja10075","url":null,"abstract":"<p><p>The purpose of this study was to investigate the cue congruency effect of auditory stimuli during visual search in dynamic environments. Twenty-eight participants were recruited to conduct a visual search experiment. The experiment applied auditory stimuli to understand whether they could facilitate visual search in different types of background. Additionally, target location and target orientation were manipulated to clarify their influences on visual search. Target location was related to horizontal visual search and target orientation was associated with visual search for an inverted target. The results regarding dynamic backgrounds reported that target-congruent auditory stimuli could speed up the visual search time. In addition, the cue congruency effect of auditory stimuli was critical for the center of the visual display but declined for the edge, indicating the inhibition of horizontal visual search behavior. Moreover, few improvements accompanying auditory stimuli were provided for the visual detection of non-inverted and inverted targets. The findings of this study suggested developing multisensory interaction with head-mounted displays, such as augmented reality glasses, in real life.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"1 1","pages":"1-15"},"PeriodicalIF":1.6,"publicationDate":"2022-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46029081","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multisensory Perception and Learning: Linking Pedagogy, Psychophysics, and Human-Computer Interaction.","authors":"Monica Gori, Sara Price, Fiona N Newell, Nadia Berthouze, Gualtiero Volpe","doi":"10.1163/22134808-bja10072","DOIUrl":"https://doi.org/10.1163/22134808-bja10072","url":null,"abstract":"<p><p>In this review, we discuss how specific sensory channels can mediate the learning of properties of the environment. In recent years, schools have increasingly been using multisensory technology for teaching. However, it still needs to be sufficiently grounded in neuroscientific and pedagogical evidence. Researchers have recently renewed understanding around the role of communication between sensory modalities during development. In the current review, we outline four principles that will aid technological development based on theoretical models of multisensory development and embodiment to foster in-depth, perceptual, and conceptual learning of mathematics. We also discuss how a multidisciplinary approach offers a unique contribution to development of new practical solutions for learning in school. Scientists, engineers, and pedagogical experts offer their interdisciplinary points of view on this topic. At the end of the review, we present our results, showing that one can use multiple sensory inputs and sensorimotor associations in multisensory technology to improve the discrimination of angles, but also possibly for educational purposes. Finally, we present an application, the 'RobotAngle' developed for primary (i.e., elementary) school children, which uses sounds and body movements to learn about angles.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"35 4","pages":"335-366"},"PeriodicalIF":1.6,"publicationDate":"2022-04-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9383652","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluating the Effect of Semantic Congruency and Valence on Multisensory Integration.","authors":"Elyse Letts, Aysha Basharat, Michael Barnett-Cowan","doi":"10.1163/22134808-bja10073","DOIUrl":"https://doi.org/10.1163/22134808-bja10073","url":null,"abstract":"<p><p>Previous studies have found that semantics, the higher-level meaning of stimuli, can impact multisensory integration; however, less is known about the effect of valence, an affective response to stimuli. This study investigated the effects of both semantic congruency and valence of non-speech audiovisual stimuli on multisensory integration via response time (RT) and temporal-order judgement (TOJ) tasks [assessing processing speed (RT), Point of Subjective Simultaneity (PSS), and time window when multisensory stimuli are likely to be perceived as simultaneous (temporal binding window; TBW)]. Through an online study with 40 participants (mean age: 26.25 years; females = 17), we found that both congruence and valence had a significant main effect on RT (congruency and positive valence decrease RT) and an interaction effect (congruent/positive valence condition being significantly faster than all others). For TOJ, there was a significant main effect of valence and a significant interaction effect where positive valence (compared to negative valence) and the congruent/positive condition (compared to all other conditions) required visual stimuli to be presented significantly earlier than auditory stimuli to be perceived as simultaneous. A subsequent analysis showed a positive correlation between TBW width and RT (as TBW widens, RT increases) for the categories that were furthest from true simultaneity in their PSS (Congruent/Positive and Incongruent/Negative). This study provides new evidence that supports previous research on semantic congruency and presents a novel incorporation of valence into behavioural responses.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"35 4","pages":"309-334"},"PeriodicalIF":1.6,"publicationDate":"2022-04-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9378768","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}