{"title":"Subjective Audibility Modulates the Susceptibility to Sound-Induced Flash Illusion: Effect of Loudness and Auditory Masking.","authors":"Yuki Ito, Hanaka Matsumoto, Kohta I Kobayasi","doi":"10.1163/22134808-bja10109","DOIUrl":"https://doi.org/10.1163/22134808-bja10109","url":null,"abstract":"<p><p>When a brief flash is presented along with two brief sounds, the single flash is often perceived as two flashes. This phenomenon is called a sound-induced flash illusion, in which the auditory sense, with its relatively higher reliability in providing temporal information, modifies the visual perception. Decline of audibility due to hearing impairment is known to make subjects less susceptible to the flash illusion. However, the effect of decline of audibility on susceptibility to the illusion has not been directly investigated in subjects with normal hearing. The present study investigates the relationship between audibility and susceptibility to the illusion by varying the sound pressure level of the stimulus. In the task for reporting the number of auditory stimuli, lowering the sound pressure level caused the rate of perceiving two sounds to decrease on account of forward masking. The occurrence of the illusory flash was reduced as the intensity of the second auditory stimulus decreased, and was significantly correlated with the rate of perceiving the two auditory stimuli. These results suggest that the susceptibility to sound-induced flash illusion depends on the subjective audibility of each sound.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"1-17"},"PeriodicalIF":1.6,"publicationDate":"2023-09-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41151212","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Giulia L Poerio, Fatimah Osman, Jennifer Todd, Jasmeen Kaur, Lovell Jones, Flavia Cardini
From the Outside in: ASMR Is Characterised by Reduced Interoceptive Accuracy but Higher Sensation Seeking.
Multisensory Research, pp. 1-21, published 2023-09-27. DOI: 10.1163/22134808-bja10108

Autonomous Sensory Meridian Response (ASMR) is a complex sensory-perceptual phenomenon characterised by relaxing and pleasurable scalp-tingling sensations. The ASMR trait is nonuniversal, is thought to have developmental origins, and has a prevalence rate of 20%. Previous theory and research suggest that trait ASMR may be underpinned by atypical multisensory perception in both interoceptive and exteroceptive modalities. In this study, we examined whether ASMR responders differed from nonresponders in interoceptive accuracy and multisensory processing style. Results showed that ASMR responders had lower interoceptive accuracy but a greater tendency towards sensation seeking, especially in the tactile, olfactory, and gustatory modalities. Exploratory mediation analyses suggest that sensation-seeking behaviours in trait ASMR could reflect a compensatory mechanism for deficits in interoceptive accuracy, a tendency to weight exteroceptive signals more strongly, or both. This study provides the foundations for understanding how interoceptive and exteroceptive mechanisms might explain not only the ASMR trait, but also individual differences in the ability to experience complex positive emotions more generally.

Brayan Rodríguez, Luis H Reyes, Felipe Reinoso-Carvalho
Exploring Crossmodal Associations Between Sound and the Chemical Senses: A Systematic Review Including Interactive Visualizations.
Multisensory Research, pp. 725-825, published 2023-09-21. DOI: 10.1163/22134808-bja10107

This is the first systematic review to focus on the influence of product-intrinsic and product-extrinsic sounds on the chemical senses, involving both food and aroma stimuli. The review pays particular attention to the methodological details (stimuli, experimental design, dependent variables, and data analysis techniques) of 95 experiments reported in 83 publications from 2012 to 2023. A total of 329 distinct crossmodal auditory-chemosensory associations were uncovered in this analysis. Moreover, instead of relying solely on static figures and tables, we created a first-of-its-kind comprehensive Power BI dashboard (Microsoft's interactive data-visualization tool) covering methodologies and significant findings, incorporating various filters and visualizations that allow readers to explore statistics for specific subsets of experiments. We believe that this review can be helpful for researchers and practitioners working in the food and beverage industry and beyond (e.g., cosmetics). The theoretical and practical implications discussed in this article point to computational approaches that facilitate decision-making regarding the design of multisensory experimental methodology.

Melissa Randazzo, Paul J Smith, Ryan Priefer, Deborah R Senzer, Karen Froud
The Audiovisual Mismatch Negativity in Predictive and Non-Predictive Speech Stimuli in Older Adults With and Without Hearing Loss.
Multisensory Research, pp. 1-29, published 2023-09-06. DOI: 10.1163/22134808-bja10106

Adults with aging-related hearing loss (ARHL) experience adaptive neural changes that optimize their sensory experiences, for example, enhanced audiovisual (AV) and predictive processing during speech perception. The mismatch negativity (MMN) event-related potential is an index of central auditory processing; however, it has not been explored as an index of AV and predictive processing in adults with ARHL. In a pilot study, we examined the AV MMN in two conditions of a passive oddball paradigm: an AV condition in which the visual aspect of the stimulus can predict the auditory percept, and an AV control condition in which it cannot. In adults with ARHL, evoked responses in the AV conditions occurred in the early MMN time window, whereas older adults with normal hearing showed a later MMN. The findings suggest that adults with ARHL are sensitive to AV incongruity even when the visual signal is not predictive of the auditory signal, indicating that predictive coding for AV speech processing may be heightened in adults with ARHL. This paradigm can be used in future studies to measure treatment-related changes, for example via aural rehabilitation, in older adults with ARHL.

{"title":"Synergistic Combination of Visual Features in Vision-Taste Crossmodal Correspondences.","authors":"Byron P Lee, Charles Spence","doi":"10.1163/22134808-bja10105","DOIUrl":"https://doi.org/10.1163/22134808-bja10105","url":null,"abstract":"<p><p>There has been a rapid recent growth in academic attempts to summarise, understand, and predict the taste profile matching complex images that incorporate multiple visual design features. While there is now ample research to document the patterns of vision-taste correspondences involving individual visual features (such as colour and shape curvilinearity in isolation), little is known about the taste associations that may be primed when multiple visual features are presented simultaneously. This narrative historical review therefore presents an overview of the research that has examined, or provided insights into, the interaction of graphic elements in taste correspondences involving colour, shape attributes, texture, and other visual features. The empirical evidence is largely in line with the predictions derived from the proposed theories concerning the origins of crossmodal correspondences; the component features of a visual stimulus are observed to contribute substantially to its taste expectations. However, the taste associated with a visual stimulus may sometimes deviate from the taste correspondences primed by its constituent parts. This may occur when a new semantic meaning emerges as multiple features are displayed together. Some visual features may even provide contextual cues for observers, thus altering the gustatory information that they associate with an image. A theoretical framework is constructed to help more intuitively predict and conceptualise the overall influence on taste correspondences when visual features are processed together as a combined image.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"1-40"},"PeriodicalIF":1.6,"publicationDate":"2023-08-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10006796","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Motion-Binding Property Contributes to Accurate Temporal-Order Perception in Audiovisual Synchrony.","authors":"Jinhwan Kwon, Yoshihiro Miyake","doi":"10.1163/22134808-bja10104","DOIUrl":"10.1163/22134808-bja10104","url":null,"abstract":"<p><p>Temporal perception in multisensory processing is important for an accurate and efficient understanding of the physical world. In general, it is executed in a dynamic environment in our daily lives. In particular, the motion-binding property is important for correctly identifying moving objects in the external environment. However, how this property affects multisensory temporal perception remains unclear. We investigate whether the motion-binding property influences audiovisual temporal integration. The study subjects performed four types of temporal-order judgment (TOJ) task experiments using three types of perception. In Experiment 1, the subjects conducted audiovisual TOJ tasks in the motion-binding condition, between two flashes, and in the simultaneous condition, in which the two flashes are perceived as simultaneous stimuli without motion. In Experiment 2, subjects conducted audiovisual TOJ tasks in the motion-binding condition and the short and long successive interval condition, in which the two stimuli are perceived as successive with no motion. The results revealed that the point of subjective simultaneity (PSS) and the just-noticeable difference (JND) in the motion-binding condition differed significantly from those in the simultaneous and short and long successive interval conditions. Specifically, the PSS in the motion-binding condition was shifted toward a sound-lead stimulus in which the PSS became closer to zero (i.e., physical simultaneity) and the JND became narrower compared to other conditions. This suggests that the motion-binding property contributes to accurate temporal integration in multisensory processing by precisely encoding the temporal order of the physical stimuli.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"36 6","pages":"557-572"},"PeriodicalIF":1.6,"publicationDate":"2023-08-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10015076","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Roxane L Bartoletti, Ambre Denis-Noël, Séraphin Boulvert, Marie Lopez, Sylvane Faure, Xavier Corveleyn
Visuo-Tactile Congruence Leads to Stronger Illusion Than Visuo-Proprioceptive Congruence: a Quantitative and Qualitative Approach to Explore the Rubber Hand Illusion.
Multisensory Research, 36(6), pp. 477-525, published 2023-06-20. DOI: 10.1163/22134808-bja10101

The Rubber Hand Illusion (RHI) arises through multisensory congruence and informative cues from the most relevant sensory channels. Some studies have explored the RHI on the fingers, but none has modulated the congruence of visuo-tactile and visuo-proprioceptive information by changing the posture of the fingers. This study hypothesizes that RHI induction is possible despite partial visuo-proprioceptive or visuo-tactile incongruence. Using quantitative and qualitative measures, we observed that the gradual induction of a sense of body ownership depends on the congruence of multisensory information, with a greater weight on visuo-tactile than on visuo-proprioceptive signals. Across the overall measures, the RHI went from stronger to weaker in the following order: full congruence; visuo-proprioceptive incongruence with visuo-tactile congruence; visuo-proprioceptive congruence with visuo-tactile incongruence; full incongruence. Our results confirm that congruent visual and tactile mapping is important, though not mandatory, for inducing a strong sense of ownership. By changing index-finger and thumb postures rather than rotating the whole hand, our study investigates the contribution of visuo-proprioception and postural congruence to RHI research. The results favor a probabilistic multisensory-integration account and are not consistent with the rules and constraints found in internal body models. The RHI could be illustrated as a continuum: the more congruent the multisensory information, the stronger the RHI.

{"title":"What Makes the Detection of Movement Different Within the Autistic Traits Spectrum? Evidence From the Audiovisual Depth Paradigm.","authors":"Rachel Poulain, Magali Batty, Céline Cappe","doi":"10.1163/22134808-bja10103","DOIUrl":"10.1163/22134808-bja10103","url":null,"abstract":"<p><p>Atypical sensory processing is now considered a diagnostic feature of autism. Although multisensory integration (MSI) may have cascading effects on the development of higher-level skills such as socio-communicative functioning, there is a clear lack of understanding of how autistic individuals integrate multiple sensory inputs. Multisensory dynamic information is a more ecological construct than static stimuli, reflecting naturalistic sensory experiences given that our environment involves moving stimulation of more than one sensory modality at a time. In particular, depth movement informs about crucial social (approaching to interact) and non-social (avoiding threats/collisions) information. As autistic characteristics are distributed on a spectrum over clinical and general populations, our work aimed to explore the multisensory integration of depth cues in the autistic personality spectrum, using a go/no-go detection task. The autistic profile of 38 participants from the general population was assessed using questionnaires extensively used in the literature. Participants performed a detection task of auditory and/or visual depth moving stimuli compared to static stimuli. We found that subjects with high-autistic traits overreacted to depth movement and exhibited faster reaction times to audiovisual cues, particularly when the audiovisual stimuli were looming and/or were presented at a fast speed. These results provide evidence of sensory particularities in people with high-autistic traits and suggest that low-level stages of multisensory integration could operate differently all along the autistic personality spectrum.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"36 6","pages":"527-556"},"PeriodicalIF":1.6,"publicationDate":"2023-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10015075","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Does Task-Irrelevant Brightness Modulation Affect Auditory Contrast Processing? Exploring the Interplay Between Temporal Synchrony and Stimulus Salience","authors":"H. Chow, Danielle Briggs, V. Ciaramitaro","doi":"10.1163/22134808-bja10102","DOIUrl":"https://doi.org/10.1163/22134808-bja10102","url":null,"abstract":"\u0000Stimulus factors such as timing, spatial location, and stimulus effectiveness affect whether and how information across the senses is integrated. Extending recent work highlighting interactions between stimulus factors, here we investigated the influence of visual information on auditory processing, complementing previous studies on the influence of auditory information on visual processing. We hypothesized that task-irrelevant and spatially non-informative visual information would enhance auditory contrast processing, when visual information was at an optimal salience level and changed synchronously with the sound. We asked human observers to indicate the location of an amplitude-modulated white-noise sound, while its loudness against a constant white-noise background varied across trials. To test for the influence of task-irrelevant visual information, we modulated screen brightness smoothly (Experiment 1) or transiently (Experiment 2) in phase or out of phase with the amplitude modulation of the target sound. In addition, to test for the interaction between temporal synchrony and stimulus salience, maximum brightness varied systematically across trials. Auditory contrast thresholds were compared across conditions. Results showed that task-irrelevant visual information did not alter auditory contrast thresholds regardless of the nature of modulation of brightness, contrary to our expectations. Nonetheless, task-irrelevant visual information modulated in phase with the target sound reduced auditory contrast thresholds if we accounted for individual differences in the optimal salience required for the largest multisensory effects. Our results are discussed in light of several stimulus factors that might be critical in modulating multisensory enhancement.","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":""},"PeriodicalIF":1.6,"publicationDate":"2023-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45356022","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Linking Auditory-Induced Bouncing and Auditory-Induced Illusory Crescents: an Individual-Differences Approach","authors":"Hauke S. Meyerhoff, M. Stegemann, C. Frings","doi":"10.1163/22134808-bja10100","DOIUrl":"https://doi.org/10.1163/22134808-bja10100","url":null,"abstract":"When two disks move toward each other, overlap, and then move apart, the visual system can resolve the ambiguity either as two disks streaming past each other or two disks bouncing off each other. Presenting a brief beep at the moment of overlap has been observed to increase the proportion of reported bouncing impressions (i.e., auditory-induced bouncing) as well as to reduce the perceived overlap between the disks (leaving a larger uncovered crescent; auditory-induced illusory crescents). Previous research has speculated about the relationship between both variables, but no direct evidence has been reported yet. We present an individual-differences study in which our participants completed the bouncing/streaming task as well as the illusory crescent task on two consecutive days (to obtain test–retest reliabilities). We obtained acceptable to good reliabilities for the effect of the tone in both dependent measures. Most importantly, auditory-induced bouncing and auditory-induced illusory crescents were correlated in the moderate range suggesting that both illusions are related and share common underlying cognitions. Yet, moderate correlations also indicate that both measures partially capture distinct aspects of the object correspondence.","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":""},"PeriodicalIF":1.6,"publicationDate":"2023-05-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48645705","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}