Multisensory Research: Latest Articles

What is the Relation between Chemosensory Perception and Chemosensory Mental Imagery?
IF 1.8 | Q4 (Psychology)
Multisensory Research Pub Date : 2024-08-27 DOI: 10.1163/22134808-bja10130
Charles Spence
{"title":"What is the Relation between Chemosensory Perception and Chemosensory Mental Imagery?","authors":"Charles Spence","doi":"10.1163/22134808-bja10130","DOIUrl":"https://doi.org/10.1163/22134808-bja10130","url":null,"abstract":"<p><p>The study of chemosensory mental imagery is undoubtedly made more difficult because of the profound individual differences that have been reported in the vividness of (e.g.) olfactory mental imagery. At the same time, the majority of those researchers who have attempted to study people's mental imagery abilities for taste (gustation) have actually mostly been studying flavour mental imagery. Nevertheless, there exists a body of human psychophysical research showing that chemosensory mental imagery exhibits a number of similarities with chemosensory perception. Furthermore, the two systems have frequently been shown to interact with one another, the similarities and differences between chemosensory perception and chemosensory mental imagery at the introspective, behavioural, psychophysical, and cognitive neuroscience levels in humans are considered in this narrative historical review. The latest neuroimaging evidence show that many of the same brain areas are engaged by chemosensory mental imagery as have previously been documented to be involved in chemosensory perception. That said, the pattern of neural connectively is reversed between the 'top-down' control of chemosensory mental imagery and the 'bottom-up' control seen in the case of chemosensory perception. 
At the same time, however, there remain a number of intriguing questions as to whether it is even possible to distinguish between orthonasal and retronasal olfactory mental imagery, and the extent to which mental imagery for flavour, which most people not only describe as, but also perceive to be, the 'taste' of food and drink, is capable of reactivating the entire flavour network in the human brain.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"1-30"},"PeriodicalIF":1.8,"publicationDate":"2024-08-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142082447","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Evidence for a Causal Dissociation of the McGurk Effect and Congruent Audiovisual Speech Perception via TMS to the Left pSTS.
IF 1.8 | Q4 (Psychology)
Multisensory Research Pub Date : 2024-08-16 DOI: 10.1163/22134808-bja10129
EunSeon Ahn, Areti Majumdar, Taraz G Lee, David Brang
{"title":"Evidence for a Causal Dissociation of the McGurk Effect and Congruent Audiovisual Speech Perception via TMS to the Left pSTS.","authors":"EunSeon Ahn, Areti Majumdar, Taraz G Lee, David Brang","doi":"10.1163/22134808-bja10129","DOIUrl":"10.1163/22134808-bja10129","url":null,"abstract":"<p><p>Congruent visual speech improves speech perception accuracy, particularly in noisy environments. Conversely, mismatched visual speech can alter what is heard, leading to an illusory percept that differs from the auditory and visual components, known as the McGurk effect. While prior transcranial magnetic stimulation (TMS) and neuroimaging studies have identified the left posterior superior temporal sulcus (pSTS) as a causal region involved in the generation of the McGurk effect, it remains unclear whether this region is critical only for this illusion or also for the more general benefits of congruent visual speech (e.g., increased accuracy and faster reaction times). Indeed, recent correlative research suggests that the benefits of congruent visual speech and the McGurk effect rely on largely independent mechanisms. To better understand how these different features of audiovisual integration are causally generated by the left pSTS, we used single-pulse TMS to temporarily disrupt processing within this region while subjects were presented with either congruent or incongruent (McGurk) audiovisual combinations. Consistent with past research, we observed that TMS to the left pSTS reduced the strength of the McGurk effect. Importantly, however, left pSTS stimulation had no effect on the positive benefits of congruent audiovisual speech (increased accuracy and faster reaction times), demonstrating a causal dissociation between the two processes. Our results are consistent with models proposing that the pSTS is but one of multiple critical areas supporting audiovisual speech interactions. 
Moreover, these data add to a growing body of evidence suggesting that the McGurk effect is an imperfect surrogate measure for more general and ecologically valid audiovisual speech behaviors.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"37 4-5","pages":"341-363"},"PeriodicalIF":1.8,"publicationDate":"2024-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11388023/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142082470","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Audiovisual Speech Perception Benefits are Stable from Preschool through Adolescence.
IF 1.8 | Q4 (Psychology)
Multisensory Research Pub Date : 2024-07-03 DOI: 10.1163/22134808-bja10128
Liesbeth Gijbels, Jason D Yeatman, Kaylah Lalonde, Piper Doering, Adrian K C Lee
{"title":"Audiovisual Speech Perception Benefits are Stable from Preschool through Adolescence.","authors":"Liesbeth Gijbels, Jason D Yeatman, Kaylah Lalonde, Piper Doering, Adrian K C Lee","doi":"10.1163/22134808-bja10128","DOIUrl":"10.1163/22134808-bja10128","url":null,"abstract":"<p><p>The ability to leverage visual cues in speech perception - especially in noisy backgrounds - is well established from infancy to adulthood. Yet, the developmental trajectory of audiovisual benefits stays a topic of debate. The inconsistency in findings can be attributed to relatively small sample sizes or tasks that are not appropriate for given age groups. We designed an audiovisual speech perception task that was cognitively and linguistically age-appropriate from preschool to adolescence and recruited a large sample ( N = 161) of children (age 4-15). We found that even the youngest children show reliable speech perception benefits when provided with visual cues and that these benefits are consistent throughout development when auditory and visual signals match. Individual variability is explained by how the child experiences their speech-in-noise performance rather than the quality of the signal itself. This underscores the importance of visual speech for young children who are regularly in noisy environments like classrooms and playgrounds.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"317-340"},"PeriodicalIF":1.8,"publicationDate":"2024-07-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141753313","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Can Multisensory Olfactory Training Improve Olfactory Dysfunction Caused by COVID-19?
IF 1.8 | Q4 (Psychology)
Multisensory Research Pub Date : 2024-07-03 DOI: 10.1163/22134808-bja10127
Gözde Filiz, Simon Bérubé, Claudia Demers, Frank Cloutier, Angela Chen, Valérie Pek, Émilie Hudon, Josiane Bolduc-Bégin, Johannes Frasnelli
{"title":"Can Multisensory Olfactory Training Improve Olfactory Dysfunction Caused by COVID-19?","authors":"Gözde Filiz, Simon Bérubé, Claudia Demers, Frank Cloutier, Angela Chen, Valérie Pek, Émilie Hudon, Josiane Bolduc-Bégin, Johannes Frasnelli","doi":"10.1163/22134808-bja10127","DOIUrl":"10.1163/22134808-bja10127","url":null,"abstract":"<p><p>Approximately 30-60% of people suffer from olfactory dysfunction (OD) such as hyposmia or anosmia after being diagnosed with COVID-19; 15-20% of these cases last beyond resolution of the acute phase. Previous studies have shown that olfactory training can be beneficial for patients affected by OD caused by viral infections of the upper respiratory tract. The aim of the study is to evaluate whether a multisensory olfactory training involving simultaneously tasting and seeing congruent stimuli is more effective than the classical olfactory training. We recruited 68 participants with persistent OD for two months or more after COVID-19 infection; they were divided into three groups. One group received olfactory training which involved smelling four odorants (strawberry, cheese, coffee, lemon; classical olfactory training). The other group received the same olfactory stimuli but presented retronasally (i.e., as droplets on their tongue); while simultaneous and congruent gustatory (i.e., sweet, salty, bitter, sour) and visual (corresponding images) stimuli were presented (multisensory olfactory training). The third group received odorless propylene glycol in four bottles (control group). Training was carried out twice daily for 12 weeks. We assessed olfactory function and olfactory specific quality of life before and after the intervention. Both intervention groups showed a similar significant improvement of olfactory function, although there was no difference in the assessment of quality of life. 
Both multisensory and classical training can be beneficial for OD following a viral infection; however, only the classical olfactory training paradigm leads to an improvement that was significantly stronger than the control group.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"299-316"},"PeriodicalIF":1.8,"publicationDate":"2024-07-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141753328","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Glassware Influences the Perception of Orange Juice in Simulated Naturalistic versus Urban Conditions.
IF 1.8 | Q4 (Psychology)
Multisensory Research Pub Date : 2024-06-18 DOI: 10.1163/22134808-bja10126
Chunmao Wu, Pei Li, Charles Spence
{"title":"Glassware Influences the Perception of Orange Juice in Simulated Naturalistic versus Urban Conditions.","authors":"Chunmao Wu, Pei Li, Charles Spence","doi":"10.1163/22134808-bja10126","DOIUrl":"10.1163/22134808-bja10126","url":null,"abstract":"<p><p>The latest research demonstrates that people's perception of orange juice can be influenced by the shape/type of receptacle in which it happens to be served. Two studies are reported that were designed to investigate the impact, if any, that the shape/type of glass might exert over the perception of the contents, the emotions induced on tasting the juice and the consumer's intention to purchase orange juice. The same quantity of orange juice (100 ml) was presented and evaluated in three different glasses: a straight-sided, a curved and a tapered glass. Questionnaires were used to assess taste (aroma, flavour intensity, sweetness, freshness and fruitiness), pleasantness and intention to buy orange juice. Study 2 assessed the impact of the same three glasses in two digitally rendered atmospheric conditions (nature vs urban). In Study 1, the perceived sweetness and pleasantness of the orange juice was significantly influenced by the shape/type of the glass in which it was presented. Study 2 reported significant interactions between condition (nature vs urban) and glass shape (tapered, straight-sided and curved). Perceived aroma, flavour intensity and pleasantness were all significantly affected by the simulated audiovisual context or atmosphere. Compared to the urban condition, perceived aroma, freshness, fruitiness and pleasantness were rated significantly higher in the nature condition. On the other hand, flavour intensity and sweetness were rated significantly higher in the urban condition than in the natural condition. 
These results are likely to be relevant for those interested in providing food services, or company managers offering beverages to their customers.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"275-297"},"PeriodicalIF":1.8,"publicationDate":"2024-06-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141421790","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Perceptual Adaptation to Noise-Vocoded Speech by Lip-Read Information: No Difference between Dyslexic and Typical Readers.
IF 1.6 | Q4 (Psychology)
Multisensory Research Pub Date : 2024-05-23 DOI: 10.1163/22134808-bja10125
Faezeh Pourhashemi, Martijn Baart, Jean Vroomen
{"title":"Perceptual Adaptation to Noise-Vocoded Speech by Lip-Read Information: No Difference between Dyslexic and Typical Readers.","authors":"Faezeh Pourhashemi, Martijn Baart, Jean Vroomen","doi":"10.1163/22134808-bja10125","DOIUrl":"10.1163/22134808-bja10125","url":null,"abstract":"<p><p>Auditory speech can be difficult to understand but seeing the articulatory movements of a speaker can drastically improve spoken-word recognition and, on the longer-term, it helps listeners to adapt to acoustically distorted speech. Given that individuals with developmental dyslexia (DD) have sometimes been reported to rely less on lip-read speech than typical readers, we examined lip-read-driven adaptation to distorted speech in a group of adults with DD ( N = 29) and a comparison group of typical readers ( N = 29). Participants were presented with acoustically distorted Dutch words (six-channel noise-vocoded speech, NVS) in audiovisual training blocks (where the speaker could be seen) interspersed with audio-only test blocks. Results showed that words were more accurately recognized if the speaker could be seen (a lip-read advantage), and that performance steadily improved across subsequent auditory-only test blocks (adaptation). There were no group differences, suggesting that perceptual adaptation to disrupted spoken words is comparable for dyslexic and typical readers. 
These data open up a research avenue to investigate the degree to which lip-read-driven speech adaptation generalizes across different types of auditory degradation, and across dyslexic readers with decoding versus comprehension difficulties.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"243-259"},"PeriodicalIF":1.6,"publicationDate":"2024-05-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141082704","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Is Front associated with Above and Back with Below? Association between Allocentric Representations of Spatial Dimensions.
IF 1.6 | Q4 (Psychology)
Multisensory Research Pub Date : 2024-05-17 DOI: 10.1163/22134808-bja10124
Lari Vainio, Martti Vainio
{"title":"Is Front associated with Above and Back with Below? Association between Allocentric Representations of Spatial Dimensions.","authors":"Lari Vainio, Martti Vainio","doi":"10.1163/22134808-bja10124","DOIUrl":"10.1163/22134808-bja10124","url":null,"abstract":"<p><p>Previous research has revealed congruency effects between different spatial dimensions such as right and up. In the audiovisual context, high-pitched sounds are associated with the spatial dimensions of up/above and front, while low-pitched sounds are associated with the spatial dimensions of down/below and back. This opens the question of whether there could also be a spatial association between above and front and/or below and back. Participants were presented with a high- or low-pitch stimulus at the time of the onset of the visual stimulus. In one block, participants responded according to the above/below location of the visual target stimulus if the target appeared in front of the reference object, and in the other block, they performed these above/below responses if the target appeared at the back of the reference. In general, reaction times revealed an advantage in processing the target location in the front-above and back-below locations. The front-above/back-below effect was more robust concerning the back-below component of the effect, and significantly larger in reaction times that were slower rather than faster than the median value of a participant. However, the pitch did not robustly influence responding to front/back or above/below locations. 
We propose that this effect might be based on the conceptual association between different spatial dimensions.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"217-241"},"PeriodicalIF":1.6,"publicationDate":"2024-05-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140960331","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Revisiting the Deviation Effects of Irrelevant Sound on Serial and Nonserial Tasks.
IF 1.6 | Q4 (Psychology)
Multisensory Research Pub Date : 2024-05-10 DOI: 10.1163/22134808-bja10123
Yu Nakajima, Hiroshi Ashida
{"title":"Revisiting the Deviation Effects of Irrelevant Sound on Serial and Nonserial Tasks.","authors":"Yu Nakajima, Hiroshi Ashida","doi":"10.1163/22134808-bja10123","DOIUrl":"10.1163/22134808-bja10123","url":null,"abstract":"<p><p>Two types of disruptive effects of irrelevant sound on visual tasks have been reported: the changing-state effect and the deviation effect. The idea that the deviation effect, which arises from attentional capture, is independent of task requirements, whereas the changing-state effect is specific to tasks that require serial processing, has been examined by comparing tasks that do or do not require serial-order processing. While many previous studies used the missing-item task as the nonserial task, it is unclear whether other cognitive tasks lead to similar results regarding the different task specificity of both effects. Kattner et al. (Memory and Cognition, 2023) used the mental-arithmetic task as the nonserial task, and failed to demonstrate the deviation effect. However, there were several procedural factors that could account for the lack of deviation effect, such as differences in design and procedures (e.g., conducted online, intermixed conditions). In the present study, we aimed to investigate whether the deviation effect could be observed in both the serial-recall and mental-arithmetic tasks when these procedural factors were modified. We found strong evidence of the deviation effect in both the serial-recall and the mental-arithmetic tasks when stimulus presentation and experimental design were aligned with previous studies that demonstrated the deviation effect (e.g., conducted in-person, blockwise presentation of sound, etc.). 
The results support the idea that the deviation effect is not task-specific.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"261-273"},"PeriodicalIF":1.6,"publicationDate":"2024-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140900337","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Perceived Audio-Visual Simultaneity Is Recalibrated by the Visual Intensity of the Preceding Trial.
IF 1.6 | Q4 (Psychology)
Multisensory Research Pub Date : 2024-04-30 DOI: 10.1163/22134808-bja10121
Ryan Horsfall, Neil Harrison, Georg Meyer, Sophie Wuerger
{"title":"Perceived Audio-Visual Simultaneity Is Recalibrated by the Visual Intensity of the Preceding Trial.","authors":"Ryan Horsfall, Neil Harrison, Georg Meyer, Sophie Wuerger","doi":"10.1163/22134808-bja10121","DOIUrl":"https://doi.org/10.1163/22134808-bja10121","url":null,"abstract":"<p><p>A vital heuristic used when making judgements on whether audio-visual signals arise from the same event, is the temporal coincidence of the respective signals. Previous research has highlighted a process, whereby the perception of simultaneity rapidly recalibrates to account for differences in the physical temporal offsets of stimuli. The current paper investigated whether rapid recalibration also occurs in response to differences in central arrival latencies, driven by visual-intensity-dependent processing times. In a behavioural experiment, observers completed a temporal-order judgement (TOJ), simultaneity judgement (SJ) and simple reaction-time (RT) task and responded to audio-visual trials that were preceded by other audio-visual trials with either a bright or dim visual stimulus. It was found that the point of subjective simultaneity shifted, due to the visual intensity of the preceding stimulus, in the TOJ, but not SJ task, while the RT data revealed no effect of preceding intensity. 
Our data therefore provide some evidence that the perception of simultaneity rapidly recalibrates based on stimulus intensity.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"37 2","pages":"143-162"},"PeriodicalIF":1.6,"publicationDate":"2024-04-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140877920","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
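The point of subjective simultaneity (PSS) discussed in the entry above is the stimulus-onset asynchrony (SOA) at which an observer reports "visual first" half the time. As an illustrative sketch only (not the authors' analysis, which would typically fit a cumulative Gaussian or logistic psychometric function), the PSS can be estimated from TOJ response proportions by linear interpolation; all data below are hypothetical:

```python
def pss_by_interpolation(soas, p_visual_first):
    """Estimate the point of subjective simultaneity (PSS): the SOA at
    which the proportion of 'visual first' responses crosses 0.5, found
    by linear interpolation between the two adjacent tested SOAs that
    straddle 0.5. Assumes proportions rise roughly monotonically with
    SOA; returns None if no crossing is found."""
    points = list(zip(soas, p_visual_first))
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if (y0 - 0.5) * (y1 - 0.5) <= 0 and y0 != y1:
            # Linear interpolation to the 50% point within this segment.
            return x0 + (0.5 - y0) * (x1 - x0) / (y1 - y0)
    return None

# Hypothetical SOAs (ms; negative = auditory lead) and response proportions.
soas = [-200, -100, 0, 100, 200]
p_vf = [0.05, 0.20, 0.50, 0.80, 0.95]
pss = pss_by_interpolation(soas, p_vf)  # 0.0 ms: no shift in this toy data
```

A recalibration effect such as the one reported above would show up as a shift in this 50% point between trials preceded by bright versus dim visual stimuli.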
Spatial Sensory References for Vestibular Self-Motion Perception.
IF 1.6 | Q4 (Psychology)
Multisensory Research Pub Date : 2023-12-20 DOI: 10.1163/22134808-bja10117
Silvia Zanchi, Luigi F Cuturi, Giulio Sandini, Monica Gori, Elisa R Ferrè
{"title":"Spatial Sensory References for Vestibular Self-Motion Perception.","authors":"Silvia Zanchi, Luigi F Cuturi, Giulio Sandini, Monica Gori, Elisa R Ferrè","doi":"10.1163/22134808-bja10117","DOIUrl":"10.1163/22134808-bja10117","url":null,"abstract":"<p><p>While navigating through the surroundings, we constantly rely on inertial vestibular signals for self-motion along with visual and acoustic spatial references from the environment. However, the interaction between inertial cues and environmental spatial references is not yet fully understood. Here we investigated whether vestibular self-motion sensitivity is influenced by sensory spatial references. Healthy participants were administered a Vestibular Self-Motion Detection Task in which they were asked to detect vestibular self-motion sensations induced by low-intensity Galvanic Vestibular Stimulation. Participants performed this detection task with or without an external visual or acoustic spatial reference placed directly in front of them. We computed the d prime ( d ' ) as a measure of participants' vestibular sensitivity and the criterion as an index of their response bias. Results showed that the visual spatial reference increased sensitivity to detect vestibular self-motion. Conversely, the acoustic spatial reference did not influence self-motion sensitivity. Both visual and auditory spatial references did not cause changes in response bias. 
Environmental visual spatial references provide relevant information to enhance our ability to perceive inertial self-motion cues, suggesting a specific interaction between visual and vestibular systems in self-motion perception.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":" ","pages":"75-88"},"PeriodicalIF":1.6,"publicationDate":"2023-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138832890","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
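The d′ and criterion measures used in the entry above come from signal detection theory: d′ is the separation between the z-transformed hit and false-alarm rates, and the criterion c indexes response bias. A minimal sketch of the standard computation, with hypothetical trial counts (not the authors' analysis code):

```python
from statistics import NormalDist

def dprime_criterion(hits, misses, false_alarms, correct_rejections):
    """Signal-detection sensitivity (d') and response bias (criterion c)
    from raw trial counts. Applies the log-linear correction (add 0.5 to
    each cell) so that hit/false-alarm rates of exactly 0 or 1 do not
    produce infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Hypothetical counts: 40 hits / 10 misses on stimulation trials,
# 10 false alarms / 40 correct rejections on no-stimulation trials.
d, c = dprime_criterion(40, 10, 10, 40)
```

With these symmetric toy counts, d′ is positive (above-chance detection) and c is zero (no bias); the study's finding of higher d′ with an unchanged criterion corresponds to the first value rising while the second stays put.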