{"title":"Sensory Integration and Cognitive Motor Interference of Postural Control in Older Adults with Mild Cognitive Impairment: A Literature Review.","authors":"Genevieve B Smith, Richard Magill, Ashwini K Rao","doi":"10.1163/22134808-bja10191","DOIUrl":"https://doi.org/10.1163/22134808-bja10191","url":null,"abstract":"<p><p>Older adults with mild cognitive impairment (MCI) have a higher risk for falls than cognitively healthy older adults. Risk factors for falls include impaired postural control. The purpose of this review is to examine the literature on the sensory strategies and cognitive processes of postural control in older adults with MCI. We included papers that used instrumented assessments to examine the effects of sensory manipulations and cognitive motor interference on postural outcomes during static balance conditions in older adults with and without MCI. Results of this review are mixed. While some research suggests older adults with MCI demonstrate differences in postural-sway outcome measures compared to older adults without MCI during static balance conditions with different sensory manipulations, this is not universal across the studies included in this review. Additional research suggests older adults with MCI demonstrate differences in postural-sway outcome measures for single- and dual-task motor performance and differences in single- and dual-task cognitive performance, but some studies included in this review found no group differences. While there is some evidence that older adults with MCI demonstrate differences in postural-sway measures under sensory manipulations and cognitive motor interference during static balance conditions compared to cognitively healthy older adults, the results of this review are inconclusive. Future research is needed to understand postural control strategies under altered sensory input and cognitive motor interference in older adults with MCI during static balance conditions; such knowledge could enhance the effectiveness of fall prevention programs and reduce future fall risk.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"1-21"},"PeriodicalIF":1.5,"publicationDate":"2026-04-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147823329","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Sweet Is Soft, Bitter Is Rough: Evidence for a Shared Emotional Dimension Across Taste and Touch.","authors":"Jongwan Kim","doi":"10.1163/22134808-bja10189","DOIUrl":"https://doi.org/10.1163/22134808-bja10189","url":null,"abstract":"<p><p>Although taste and touch are processed through separate sensory channels, both elicit strong emotional responses that may share an underlying affective organization. This study investigated whether gustatory and tactile experiences are structured within a common emotional coordinate system defined by valence and arousal. To address this question, we re-analyzed two previously collected datasets, one for taste-based emotion ratings and one for tactile emotion ratings. In these two independent datasets, 30 participants evaluated four taste stimuli (sweet, sour, bitter, and salty) on ten affective adjectives, while a separate group of 27 participants rated four tactile textures (rough-hard, rough-soft, smooth-hard, and smooth-soft) on twenty affective adjectives. Principal component analyses of the group-averaged emotion matrices revealed comparable two-dimensional affective spaces in both modalities, corresponding to pleasant-unpleasant and high-low arousal dimensions. Procrustes alignment and consensus mapping demonstrated strong geometric correspondence between taste and touch, with sweet aligning most closely with smooth-soft, sour with smooth-hard, salty with rough-hard, and bitter with rough-soft. Quantitatively, the Procrustes distance was low (d = 0.094), the mean cosine similarity across pairs was 0.70, and the Mantel test showed a significant correlation (r = 0.90, p = 0.04), confirming robust cross-modal affective similarity. A combined multidimensional scaling solution further integrated both modalities into a unified affective map with negligible stress. Together, these results provide converging evidence that affective responses to taste and touch follow a shared low-dimensional structure, supporting the idea that affective responses across modalities can be represented within a common evaluative coordinate space. This finding highlights emotion as a unifying representational framework that bridges distinct perceptual domains and informs multisensory theories of affect.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"1-23"},"PeriodicalIF":1.5,"publicationDate":"2026-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147619227","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"When in Doubt, Touch Is More Convincing Than Vision.","authors":"Merle T Fairhurst, Eoin Travers, Isabelle Ripp, Ophelia Deroy, Vincent Hayward","doi":"10.1163/22134808-bja10188","DOIUrl":"https://doi.org/10.1163/22134808-bja10188","url":null,"abstract":"<p><p>We sometimes feel compelled to touch objects after having looked at them. Is this because touch gives us more accurate information, or is it simply because we tend to trust what we feel more than what we see? To disentangle these possibilities, participants were subjected to a variant of the 'Vertical-Horizontal' illusion, in which the length of a vertical bar is consistently overestimated relative to the length of a horizontal bar. Our variant focuses on points of subjective equality, where the bars look almost the same, and possesses two key characteristics: it creates ambiguity, as stimuli can be subjectively similar yet objectively distinct, and it operates in both vision and touch. In a forced resampling bimodal experiment, participants inspected the stimuli using both vision and touch, judged which bar appeared longer, and rated their confidence. They then re-examined the same objects using either vision or touch, as instructed, and provided a second judgement with another confidence rating. Resampling did not significantly improve overall accuracy, but participants were more likely to change their mind when using touch. In a second, free resampling bimodal experiment, participants chose their preferred modality for reinspection. Neither modality led to more accurate responses, but participants chose to resample by touch more frequently when the perceived lengths of the vertical and horizontal bars were closer, i.e., when ambiguity was higher. Additionally, they were more likely to change their mind under these ambiguous conditions. These findings suggest a selective bias toward touch in cases of ambiguity, even when it offers no objective advantage over vision.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"39 3-5","pages":"399-415"},"PeriodicalIF":1.5,"publicationDate":"2026-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147596231","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Overlapping Effects of Music Training on Multisensory and Emotion Processing: A Systematic Review.","authors":"Yuqing Che, Kristina Eva Manda, Indi Parmar, Chris Ashwin, Karin Petrini","doi":"10.1163/22134808-bja10186","DOIUrl":"https://doi.org/10.1163/22134808-bja10186","url":null,"abstract":"<p><p>Evidence suggests musicians have enhanced audiovisual and emotion recognition abilities. However, these two lines of research have generally been separated in the literature, despite these processes being similarly altered in certain populations (e.g., autism, schizophrenia). The current systematic review presents a comprehensive picture of the effect of music training on behavioural and neural changes in audiovisual and emotion recognition processes, to better understand where they might overlap or share any similarities. It additionally assessed the impact of different music training factors (i.e., training onset, length, type of musical instrument and the type of research task). Finally, this review aimed to produce a clearer understanding of whether the effects of music training extend beyond the music and sound domain. Following PRISMA guidelines, 64 papers were identified, of which 41 examined audiovisual processing, 20 investigated emotion processing, and three examined both processes. The available evidence revealed a consistent musician's advantage for some audiovisual processes (e.g., audiovisual temporal correspondence), with some evidence that this advantage extended beyond the music domain. A consistent musician's advantage was also found for processing basic emotions from speech prosody, with some evidence that this extended to complex emotions. A shared brain network for these effects was identified, comprising the anterior cingulate cortex and superior frontal gyrus. Together, our findings suggest that audiovisual and emotion recognition processes share a number of similarities in how music training can shape them. Further research should directly explore the combined effect of music training on multisensory and emotion recognition to inform effective music interventions aimed at enhancing these processes.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"1-53"},"PeriodicalIF":1.5,"publicationDate":"2026-02-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"147596216","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Audiovisual Contributions to the Occurrence of Roll Vection in Virtual Reality.","authors":"Max Teaford, Shyla S Khan, Tanner Greene, Sandra Pullam, Jacqueline Lawshe, Jackson Shaheen","doi":"10.1163/22134808-bja10187","DOIUrl":"https://doi.org/10.1163/22134808-bja10187","url":null,"abstract":"<p><p>Vection (i.e., the experience of self-motion in the absence of actual motion) has traditionally been considered a visual phenomenon. However, recent work on yaw vection (i.e., illusory rotations around the vertical axis) suggests that auditory cues may contribute to vection as well, specifically when the sounds are similar to those heard in the real world and come from a sound source that typically remains in the same spatial location. In the present study, we sought to determine if roll vection (i.e., illusory rotations around the longitudinal axis) is also enhanced by auditory cues. To test this possibility, we had 44 participants experience three different combinations of sensory cues (audiovisual, visual-only and auditory-only), which were presented via virtual reality a total of three times each. We found that participants experienced vection sooner and more convincingly the first two times they were exposed to the audiovisual condition relative to the visual-only and auditory-only conditions. However, there was no difference between the audiovisual and visual-only conditions the third time they experienced them. Regardless of the number of times the participants experienced each condition, the audiovisual and visual-only conditions were always characterized by higher convincingness ratings and lower onset latencies than the auditory-only condition. In tandem, these results suggest that audiovisual stimuli can indeed elicit roll vection that starts earlier and is more convincing than unisensory variants of the stimuli (i.e., visual-only and auditory-only). However, the benefit of including auditory cues diminishes over repetitions, suggesting that the brain may down-weight auditory cues. Future studies are needed to better characterize the mechanisms underlying this finding and other factors that may impact this effect (e.g., sound type and field-of-view size).</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"1-22"},"PeriodicalIF":1.5,"publicationDate":"2026-02-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146214991","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Life and Works of Vincent Hayward: An Introduction.","authors":"Alessandro Farnè, Luke E Miller","doi":"10.1163/22134808-20250001","DOIUrl":"10.1163/22134808-20250001","url":null,"abstract":"<p><p>In this Introduction, we have the pleasure of introducing the twelve articles of this Special Issue of Multisensory Research celebrating the life and works of Vincent Hayward. Vincent was a prolific scientist, collaborator, and colleague. As you will see from the variety of contributed papers, his influence spanned several fields and topics: from engineering to neurophysiology, from skin mechanics to olfactory metacognition. We and many others had the pleasure of knowing and working with Vincent. His boundless curiosity shines through in the papers of this Special Issue, and hopefully in our Introduction as well. Though gone, he is not forgotten; his legacy and influence live on in the hearts and minds of colleagues studying the (neuro)science of body perception.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"203-209"},"PeriodicalIF":1.5,"publicationDate":"2026-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146114852","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Beyond the Auditory System: Sensory Processing in Decreased Sound Tolerance Disorders.","authors":"Merve Meral Çetinkaya, Azize Arzu Koçyiğit Köroğlu, Ümit Can Çetinkaya","doi":"10.1163/22134808-bja10185","DOIUrl":"https://doi.org/10.1163/22134808-bja10185","url":null,"abstract":"<p><p>This study aims to investigate the correlation between decreased sound tolerance disorders (DSTs) and sensory processing disorder, as well as the sensory systems that are affected when decreased sound tolerance disorders exist. The study included 315 individuals aged 18-35 with normal hearing and no neurological disorders. Participants completed the Decreased Sound Tolerance Disorder Scale (DSTS) and the Adult Sensory Processing Scale (ASPS). According to the DSTS, 278 individuals with decreased sound tolerance disorders were included as the study group, and 37 individuals without decreased sound tolerance were included as a control group. The DSTS includes 33 items assessing symptoms of hyperacusis, phonophobia, and misophonia. The ASPS consists of 48 items across 11 factors that assess the sensitivity of different sensory domains. The distribution of decreased sound tolerance disorders among the participants indicated that 113 participants (35.9%) had all three types of DSTs (triple DST), 85 participants (27.0%) had two types of DSTs (dual DST), 16 participants (5.1%) had hyperacusis only, 14 participants (4.4%) had phonophobia only, 50 participants (15.9%) had misophonia only, and 37 participants (11.7%) had no DSTs (non-DST). Moderate positive correlations were found between total ASPS scores and both hyperacusis (r = 0.260, p < 0.001) and misophonia (r = 0.348, p < 0.001) scores. Total ASPS scores were higher in those with misophonia compared to those without (p < 0.05). The results of this study indicate that individuals with decreased sound tolerance may have difficulty processing not only auditory but also visual, vestibular, proprioceptive, and tactile stimuli.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"1-20"},"PeriodicalIF":1.5,"publicationDate":"2026-01-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146107906","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Haptic-Sound Analysis of Materials' Clustering Based on Tool Tip Tapping Exploration.","authors":"Kriti Datta, Amit Bhardwaj, Manish Narwaria","doi":"10.1163/22134808-bja10184","DOIUrl":"10.1163/22134808-bja10184","url":null,"abstract":"<p><p>Tapping surfaces with a tool tip elicits both sound and haptic information. The sound and haptic information are captured by a microphone and by an accelerometer attached to the tip, respectively. In relation to the task of distinguishing objects, this paper investigates the following two questions: (1) how does each signal (sound and acceleration) individually help us assess the surface? (2) How does the integration of both modalities affect this assessment? We approach the problem of texture assessment as an unsupervised learning problem. For this purpose, perceptual filter banks are designed based on Weber's law of frequency perception for both modalities to extract the corresponding features. Furthermore, we introduce a symmetric KL divergence-based texture similarity metric, which allowed us to compare the sound and haptic (acceleration) modalities. Based on our similarity metric comparisons, we argue that proximity between object types is preserved across modalities. Finally, using a permutation test-based approach, we demonstrate that the complementarity of sound domain information to the haptic domain varies depending on the type of object.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"417-432"},"PeriodicalIF":1.5,"publicationDate":"2026-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146107854","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Effect of Mask Wearing on Lip-Reading and Audiovisual Speech Perception.","authors":"Yuta Ujiie, Kohske Takahashi","doi":"10.1163/22134808-bja10182","DOIUrl":"https://doi.org/10.1163/22134808-bja10182","url":null,"abstract":"<p><p>Seeing facial speech plays a crucial role in helping listeners understand a person's speech. Since the outbreak of the COVID-19 pandemic, however, relying on facial speech became more difficult because most people routinely wore masks to prevent infection. This study investigates whether and how wearing a mask alters reliance on facial speech for audiovisual speech perception. In this cross-sectional study, we compared the task performance of Japanese young adults in audiovisual speech recognition (i.e., the McGurk effect) and lip-reading between prepandemic and postpandemic groups. For the prepandemic data, we used data collected between June and July 2019 in a previous study of ours; for the postpandemic data, we collected data from November 2022 to April 2023. The results showed that the amount of McGurk effect (i.e., the amount of reliance on facial speech) in the postpandemic data was comparable to that in the prepandemic data. Additionally, there were no significant differences in lip-reading accuracy or in congruent audiovisual speech recognition. The results imply that, among Japanese young adults, the perceiver's reliance on visual speech as a strategy during audiovisual speech processing did not significantly change from before to after the COVID-19 pandemic.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"1-12"},"PeriodicalIF":1.5,"publicationDate":"2026-01-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146042170","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multisensory Tuning of Emotional Face Recognition: A Comparative Study of Olfactory and Gustatory Influences.","authors":"Qingya Yang, Huajing Yang, Ao Wang, Lina Huang, Liuqing Wei, Wenbin Shen, Weiping Yang, Qingguo Ding, Pei Liang","doi":"10.1163/22134808-bja10183","DOIUrl":"https://doi.org/10.1163/22134808-bja10183","url":null,"abstract":"<p><p>Olfaction and gustation are central to affective experience, yet their distinct contributions to social emotion perception remain underexplored compared to vision and audition. This study examined how sweet and sour chemosensory cues modulate emotional judgments of facial expressions, testing whether judgments are systematically biased toward hedonically congruent emotion categories. Two behavioral experiments were conducted. In the olfactory session, participants categorized happy, disgusted, and neutral faces at high (100%) and low (50%) intensity while exposed to either a sweet (melon) or sour (vinegar) odor. In the gustatory session, they performed the same task after ingesting a sweet (sucrose) or sour (citric acid) solution. Accuracy and reaction times were analyzed using generalized linear mixed models. Analyses of the accuracy data revealed a significant main effect of intensity and a significant condition × emotion interaction. Happy faces were more accurately identified in the sweet condition, whereas disgusted faces were more accurately identified in the sour condition. Thus, sweet and sour chemosensory cues systematically bias emotional judgments of visual facial expressions. These effects support hedonic congruency predictions and highlight the importance of incorporating chemosensory context into multisensory models of emotion.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"1-18"},"PeriodicalIF":1.5,"publicationDate":"2026-01-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"146042165","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}