Latest Publications in Multisensory Research

Audio-Visual Interference During Motion Discrimination in Starlings.
IF 1.6 · Tier 4 · Psychology
Multisensory Research Pub Date: 2023-01-17 DOI: 10.1163/22134808-bja10092
Gesa Feenders, Georg M Klump
{"title":"Audio-Visual Interference During Motion Discrimination in Starlings.","authors":"Gesa Feenders,&nbsp;Georg M Klump","doi":"10.1163/22134808-bja10092","DOIUrl":"https://doi.org/10.1163/22134808-bja10092","url":null,"abstract":"<p><p>Motion discrimination is essential for animals to avoid collisions, to escape from predators, to catch prey or to communicate. Although most terrestrial vertebrates can benefit by combining concurrent stimuli from sound and vision to obtain a most salient percept of the moving object, there is little research on the mechanisms involved in such cross-modal motion discrimination. We used European starlings as a model with a well-studied visual and auditory system. In a behavioural motion discrimination task with visual and acoustic stimuli, we investigated the effects of cross-modal interference and attentional processes. Our results showed an impairment of motion discrimination when the visual and acoustic stimuli moved in opposite directions as compared to congruent motion direction. By presenting an acoustic stimulus of very short duration, thus lacking directional motion information, an additional alerting effect of the acoustic stimulus became evident. Finally, we show that a temporally leading acoustic stimulus did not improve the response behaviour compared to the synchronous presentation of the stimuli as would have been expected in case of major alerting effects. This further supports the importance of congruency and synchronicity in the current test paradigm with a minor role of attentional processes elicited by the acoustic stimulus. Together, our data clearly show cross-modal interference effects in an audio-visual motion discrimination paradigm when carefully selecting real-life stimuli under parameter conditions that meet the known criteria for cross-modal binding.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"36 2","pages":"181-212"},"PeriodicalIF":1.6,"publicationDate":"2023-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10834687","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Can We Train Multisensory Integration in Adults? A Systematic Review.
IF 1.6 · Tier 4 · Psychology
Multisensory Research Pub Date: 2023-01-13 DOI: 10.1163/22134808-bja10090
Jessica O'Brien, Amy Mason, Jason Chan, Annalisa Setti
{"title":"Can We Train Multisensory Integration in Adults? A Systematic Review.","authors":"Jessica O'Brien,&nbsp;Amy Mason,&nbsp;Jason Chan,&nbsp;Annalisa Setti","doi":"10.1163/22134808-bja10090","DOIUrl":"https://doi.org/10.1163/22134808-bja10090","url":null,"abstract":"<p><p>The ability to efficiently combine information from different senses is an important perceptual process that underpins much of our daily activities. This process, known as multisensory integration, varies from individual to individual, and is affected by the ageing process, with impaired processing associated with age-related conditions, including balance difficulties, mild cognitive impairment and cognitive decline. Impaired multisensory perception has also been associated with a range of neurodevelopmental conditions, where novel intervention approaches are actively sought, for example dyslexia and autism. However, it remains unclear to what extent and how multisensory perception can be modified by training. This systematic review aims to evaluate the evidence that we can train multisensory perception in neurotypical adults. In all, 1521 studies were identified following a systematic search of the databases PubMed, Scopus, PsychInfo and Web of Science. Following screening for inclusion and exclusion criteria, 27 studies were chosen for inclusion. Study quality was assessed using the Methodological Index for Non-Randomised Studies (MINORS) tool and the Cochrane Risk of Bias tool 2.0 for Randomised Control Trials. We found considerable evidence that in-task feedback training using psychophysics protocols led to improved task performance. The generalisability of this training to other tasks of multisensory integration was inconclusive, with few studies and mixed findings reported. Promising findings from exercise-based training indicate physical activity protocols warrant further investigation as potential training avenues for improving multisensory integration. Future research directions should include trialling training protocols with clinical populations and other groups who would benefit from targeted training to improve inefficient multisensory integration.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"36 2","pages":"111-180"},"PeriodicalIF":1.6,"publicationDate":"2023-01-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10835145","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
Front matter
Tier 4 · Psychology
Multisensory Research Pub Date: 2023-01-11 DOI: 10.1163/22134808-00351p14
{"title":"Front matter","authors":"","doi":"10.1163/22134808-00351p14","DOIUrl":"https://doi.org/10.1163/22134808-00351p14","url":null,"abstract":"","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136082543","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
'Tasting Imagination': What Role Chemosensory Mental Imagery in Multisensory Flavour Perception?
IF 1.6 · Tier 4 · Psychology
Multisensory Research Pub Date: 2022-12-30 DOI: 10.1163/22134808-bja10091
Charles Spence
{"title":"'Tasting Imagination': What Role Chemosensory Mental Imagery in Multisensory Flavour Perception?","authors":"Charles Spence","doi":"10.1163/22134808-bja10091","DOIUrl":"https://doi.org/10.1163/22134808-bja10091","url":null,"abstract":"<p><p>A number of perplexing phenomena in the area of olfactory/flavour perception may fruitfully be explained by the suggestion that chemosensory mental imagery can be triggered automatically by perceptual inputs. In particular, the disconnect between the seemingly limited ability of participants in chemosensory psychophysics studies to distinguish more than two or three odorants in mixtures and the rich and detailed flavour descriptions that are sometimes reported by wine experts; the absence of awareness of chemosensory loss in many elderly individuals; and the insensitivity of the odour-induced taste enhancement (OITE) effect to the mode of presentation of olfactory stimuli (i.e., orthonasal or retronasal). The suggestion made here is that the theory of predictive coding, developed first in the visual modality, be extended to chemosensation. This may provide a fruitful way of thinking about the interaction between mental imagery and perception in the experience of aromas and flavours. Accepting such a suggestion also raises some important questions concerning the ecological validity/meaning of much of the chemosensory psychophysics literature that has been published to date.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"36 1","pages":"93-109"},"PeriodicalIF":1.6,"publicationDate":"2022-12-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10708023","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
The Impact of Singing on Visual and Multisensory Speech Perception in Children on the Autism Spectrum.
IF 1.8 · Tier 4 · Psychology
Multisensory Research Pub Date: 2022-12-30 DOI: 10.1163/22134808-bja10087
Jacob I Feldman, Alexander Tu, Julie G Conrad, Wayne Kuang, Pooja Santapuram, Tiffany G Woynaroski
{"title":"The Impact of Singing on Visual and Multisensory Speech Perception in Children on the Autism Spectrum.","authors":"Jacob I Feldman, Alexander Tu, Julie G Conrad, Wayne Kuang, Pooja Santapuram, Tiffany G Woynaroski","doi":"10.1163/22134808-bja10087","DOIUrl":"10.1163/22134808-bja10087","url":null,"abstract":"<p><p>Autistic children show reduced multisensory integration of audiovisual speech stimuli in response to the McGurk illusion. Previously, it has been shown that adults can integrate sung McGurk tokens. These sung speech tokens offer more salient visual and auditory cues, in comparison to the spoken tokens, which may increase the identification and integration of visual speech cues in autistic children. Forty participants (20 autism, 20 non-autistic peers) aged 7-14 completed the study. Participants were presented with speech tokens in four modalities: auditory-only, visual-only, congruent audiovisual, and incongruent audiovisual (i.e., McGurk; auditory 'ba' and visual 'ga'). Tokens were also presented in two formats: spoken and sung. Participants indicated what they perceived via a four-button response box (i.e., 'ba', 'ga', 'da', or 'tha'). Accuracies and perception of the McGurk illusion were calculated for each modality and format. Analysis of visual-only identification indicated a significant main effect of format, whereby participants were more accurate in sung versus spoken trials, but no significant main effect of group or interaction effect. Analysis of the McGurk trials indicated no significant main effect of format or group and no significant interaction effect. Sung speech tokens improved identification of visual speech cues, but did not boost the integration of visual cues with heard speech across groups. Additional work is needed to determine what properties of spoken speech contributed to the observed improvement in visual accuracy and to evaluate whether more prolonged exposure to sung speech may yield effects on multisensory integration.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"36 1","pages":"57-74"},"PeriodicalIF":1.8,"publicationDate":"2022-12-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9924934/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10707539","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
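For readers unfamiliar with how "perception of the McGurk illusion" is typically scored: on incongruent trials (auditory 'ba' paired with visual 'ga'), responses of 'da' or 'tha' are conventionally counted as fused percepts, and the proportion of fused responses is compared across formats and groups. The toy sketch below illustrates that common scoring convention only; the response data are invented for illustration and are not the paper's results.

```python
from collections import Counter

# Hypothetical responses from one participant on incongruent
# (auditory 'ba' + visual 'ga') trials, in spoken vs sung format.
responses = {
    "spoken": ["ba", "da", "ba", "tha", "ba", "ba", "da", "ba"],
    "sung":   ["da", "da", "ba", "tha", "da", "ba", "da", "tha"],
}

FUSED = {"da", "tha"}  # percepts conventionally counted as McGurk fusion

for fmt, resp in responses.items():
    counts = Counter(resp)
    fusion_rate = sum(counts[r] for r in FUSED) / len(resp)
    print(f"{fmt}: fusion rate = {fusion_rate:.2f} (counts: {dict(counts)})")
```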
Crossmodal Texture Perception Is Illumination-Dependent.
IF 1.6 · Tier 4 · Psychology
Multisensory Research Pub Date: 2022-12-28 DOI: 10.1163/22134808-bja10089
Karina Kangur, Martin Giesel, Julie M Harris, Constanze Hesse
{"title":"Crossmodal Texture Perception Is Illumination-Dependent.","authors":"Karina Kangur,&nbsp;Martin Giesel,&nbsp;Julie M Harris,&nbsp;Constanze Hesse","doi":"10.1163/22134808-bja10089","DOIUrl":"https://doi.org/10.1163/22134808-bja10089","url":null,"abstract":"<p><p>Visually perceived roughness of 3D textures varies with illumination direction. Surfaces appear rougher when the illumination angle is lowered resulting in a lack of roughness constancy. Here we aimed to investigate whether the visual system also relies on illumination-dependent features when judging roughness in a crossmodal matching task or whether it can access illumination-invariant surface features that can also be evaluated by the tactile system. Participants ( N = 32) explored an abrasive paper of medium physical roughness either tactually, or visually under two different illumination conditions (top vs oblique angle). Subsequently, they had to judge if a comparison stimulus (varying in physical roughness) matched the previously explored standard. Matching was either performed using the same modality as during exploration (intramodal) or using a different modality (crossmodal). In the intramodal conditions, participants performed equally well independent of the modality or illumination employed. In the crossmodal conditions, participants selected rougher tactile matches after exploring the standard visually under oblique illumination than under top illumination. Conversely, after tactile exploration, they selected smoother visual matches under oblique than under top illumination. These findings confirm that visual roughness perception depends on illumination direction and show, for the first time, that this failure of roughness constancy also transfers to judgements made crossmodally.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"36 1","pages":"75-91"},"PeriodicalIF":1.6,"publicationDate":"2022-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10707537","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Body Pitch Together With Translational Body Motion Biases the Subjective Haptic Vertical.
IF 1.6 · Tier 4 · Psychology
Multisensory Research Pub Date: 2022-12-20 DOI: 10.1163/22134808-bja10086
Chia-Huei Tseng, Hiu Mei Chow, Lothar Spillmann, Matt Oxner, Kenzo Sakurai
{"title":"Body Pitch Together With Translational Body Motion Biases the Subjective Haptic Vertical.","authors":"Chia-Huei Tseng,&nbsp;Hiu Mei Chow,&nbsp;Lothar Spillmann,&nbsp;Matt Oxner,&nbsp;Kenzo Sakurai","doi":"10.1163/22134808-bja10086","DOIUrl":"https://doi.org/10.1163/22134808-bja10086","url":null,"abstract":"<p><p>Accurate perception of verticality is critical for postural maintenance and successful physical interaction with the world. Although previous research has examined the independent influences of body orientation and self-motion under well-controlled laboratory conditions, these factors are constantly changing and interacting in the real world. In this study, we examine the subjective haptic vertical in a real-world scenario. Here, we report a bias of verticality perception in a field experiment on the Hong Kong Peak Tram as participants traveled on a slope ranging from 6° to 26°. Mean subjective haptic vertical (SHV) increased with slope by as much as 15°, regardless of whether the eyes were open (Experiment 1) or closed (Experiment 2). Shifting the body pitch by a fixed degree in an effort to compensate for the mountain slope failed to reduce the verticality bias (Experiment 3). These manipulations separately rule out visual and vestibular inputs about absolute body pitch as contributors to our observed bias. Observations collected on a tram traveling on level ground (Experiment 4A) or in a static dental chair with a range of inclinations similar to those encountered on the mountain tram (Experiment 4B) showed no significant deviation of the subjective vertical from gravity. We conclude that the SHV error is due to a combination of large, dynamic body pitch and translational motion. These observations made in a real-world scenario represent an incentive to neuroscientists and aviation experts alike for studying perceived verticality under field conditions and raising awareness of dangerous misperceptions of verticality when body pitch and translational self-motion come together.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"36 1","pages":"1-29"},"PeriodicalIF":1.6,"publicationDate":"2022-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10707538","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Dynamic Weighting of Time-Varying Visual and Auditory Evidence During Multisensory Decision Making.
IF 1.6 · Tier 4 · Psychology
Multisensory Research Pub Date: 2022-12-01 DOI: 10.1163/22134808-bja10088
Rosanne R M Tuip, Wessel van der Ham, Jeannette A M Lorteije, Filip Van Opstal
{"title":"Dynamic Weighting of Time-Varying Visual and Auditory Evidence During Multisensory Decision Making.","authors":"Rosanne R M Tuip,&nbsp;Wessel van der Ham,&nbsp;Jeannette A M Lorteije,&nbsp;Filip Van Opstal","doi":"10.1163/22134808-bja10088","DOIUrl":"https://doi.org/10.1163/22134808-bja10088","url":null,"abstract":"<p><p>Perceptual decision-making in a dynamic environment requires two integration processes: integration of sensory evidence from multiple modalities to form a coherent representation of the environment, and integration of evidence across time to accurately make a decision. Only recently studies started to unravel how evidence from two modalities is accumulated across time to form a perceptual decision. One important question is whether information from individual senses contributes equally to multisensory decisions. We designed a new psychophysical task that measures how visual and auditory evidence is weighted across time. Participants were asked to discriminate between two visual gratings, and/or two sounds presented to the right and left ear based on respectively contrast and loudness. We varied the evidence, i.e., the contrast of the gratings and amplitude of the sound, over time. Results showed a significant increase in performance accuracy on multisensory trials compared to unisensory trials, indicating that discriminating between two sources is improved when multisensory information is available. Furthermore, we found that early evidence contributed most to sensory decisions. Weighting of unisensory information during audiovisual decision-making dynamically changed over time. A first epoch was characterized by both visual and auditory weighting, during the second epoch vision dominated and the third epoch finalized the weighting profile with auditory dominance. Our results suggest that during our task multisensory improvement is generated by a mechanism that requires cross-modal interactions but also dynamically evokes dominance switching.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"36 1","pages":"31-56"},"PeriodicalIF":1.6,"publicationDate":"2022-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10708021","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
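The abstract does not describe the authors' analysis pipeline, but one common way to estimate such time-resolved sensory weights is psychophysical reverse correlation: regress trial-by-trial choices on the time-binned visual and auditory evidence and treat the fitted coefficients as the weight of each modality in each epoch. A minimal sketch on simulated data follows (the three-epoch structure, variable names, and weight values are illustrative assumptions, not the paper's method):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, n_bins = 2000, 3  # three time bins ("epochs") per trial

# Simulated evidence: per-trial contrast and loudness differences
# (right minus left) in each time bin.
vis = rng.normal(0, 1, (n_trials, n_bins))
aud = rng.normal(0, 1, (n_trials, n_bins))

# Ground-truth weights mimicking the reported profile: joint weighting
# early, visual dominance mid-trial, auditory dominance late.
w_vis = np.array([1.0, 1.2, 0.2])
w_aud = np.array([1.0, 0.3, 1.1])
drive = vis @ w_vis + aud @ w_aud
choice = (drive + rng.normal(0, 1, n_trials)) > 0  # True = "right" decision

# Regress choices on the binned evidence from both modalities; the
# fitted coefficients recover the temporal weighting profile.
X = np.hstack([vis, aud])
model = LogisticRegression().fit(X, choice)
vis_weights = model.coef_[0][:n_bins]
aud_weights = model.coef_[0][n_bins:]

print("visual weights per epoch: ", np.round(vis_weights, 2))
print("auditory weights per epoch:", np.round(aud_weights, 2))
```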
Prior Exposure to Dynamic Visual Displays Reduces Vection Onset Latency.
IF 1.6 · Tier 4 · Psychology
Multisensory Research Pub Date: 2022-11-16 DOI: 10.1163/22134808-bja10084
Jing Ni, Hiroyuki Ito, Masaki Ogawa, Shoji Sunaga, Stephen Palmisano
{"title":"Prior Exposure to Dynamic Visual Displays Reduces Vection Onset Latency.","authors":"Jing Ni,&nbsp;Hiroyuki Ito,&nbsp;Masaki Ogawa,&nbsp;Shoji Sunaga,&nbsp;Stephen Palmisano","doi":"10.1163/22134808-bja10084","DOIUrl":"https://doi.org/10.1163/22134808-bja10084","url":null,"abstract":"<p><p>While compelling illusions of self-motion (vection) can be induced purely by visual motion, they are rarely experienced immediately. This vection onset latency is thought to represent the time required to resolve sensory conflicts between the stationary observer's visual and nonvisual information about self-motion. In this study, we investigated whether manipulations designed to increase the weightings assigned to vision (compared to the nonvisual senses) might reduce vection onset latency. We presented two different types of visual priming displays directly before our main vection-inducing displays: (1) 'random motion' priming displays - designed to pre-activate general, as opposed to self-motion-specific, visual motion processing systems; and (2) 'dynamic no-motion' priming displays - designed to stimulate vision, but not generate conscious motion perceptions. Prior exposure to both types of priming displays was found to significantly shorten vection onset latencies for the main self-motion display. These experiments show that vection onset latencies can be reduced by pre-activating the visual system with both types of priming display. Importantly, these visual priming displays did not need to be capable of inducing vection or conscious motion perception in order to produce such benefits.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"35 7-8","pages":"653-676"},"PeriodicalIF":1.6,"publicationDate":"2022-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10708022","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Can the Perceived Timing of Multisensory Events Predict Cybersickness?
IF 1.6 · Tier 4 · Psychology
Multisensory Research Pub Date: 2022-10-24 DOI: 10.1163/22134808-bja10083
Ogai Sadiq, Michael Barnett-Cowan
{"title":"Can the Perceived Timing of Multisensory Events Predict Cybersickness?","authors":"Ogai Sadiq,&nbsp;Michael Barnett-Cowan","doi":"10.1163/22134808-bja10083","DOIUrl":"https://doi.org/10.1163/22134808-bja10083","url":null,"abstract":"<p><p>Humans are constantly presented with rich sensory information that the central nervous system (CNS) must process to form a coherent perception of the self and its relation to its surroundings. While the CNS is efficient in processing multisensory information in natural environments, virtual reality (VR) poses challenges of temporal discrepancies that the CNS must solve. These temporal discrepancies between information from different sensory modalities leads to inconsistencies in perception of the virtual environment which often causes cybersickness. Here, we investigate whether individual differences in the perceived relative timing of sensory events, specifically parameters of temporal-order judgement (TOJ), can predict cybersickness. Study 1 examined audiovisual (AV) TOJs while Study 2 examined audio-active head movement (AAHM) TOJs. We deduced metrics of the temporal binding window (TBW) and point of subjective simultaneity (PSS) for a total of 50 participants. Cybersickness was quantified using the Simulator Sickness Questionnaire (SSQ). Study 1 results (correlations and multiple regression) show that the oculomotor SSQ shares a significant yet positive correlation with AV PSS and TBW. While there is a positive correlation between the total SSQ scores and the TBW and PSS, these correlations are not significant. Therefore, although these results are promising, we did not find the same effect for AAHM TBW and PSS. We conclude that AV TOJ may serve as a potential tool to predict cybersickness in VR. Such findings will generate a better understanding of cybersickness which can be used for development of VR to help mitigate discomfort and maximize adoption.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":"35 7-8","pages":"623-652"},"PeriodicalIF":1.6,"publicationDate":"2022-10-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10708024","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
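The PSS and TBW are standard temporal-order-judgement metrics. The abstract does not specify the fitting procedure used, but a common approach is to fit a cumulative Gaussian to the proportion of one response type (e.g., "audio first") as a function of stimulus onset asynchrony (SOA): the fitted mean gives the PSS, and the TBW is commonly derived from the fitted standard deviation. A minimal sketch on synthetic data (the SOA values, response proportions, and the 25-75% TBW definition are illustrative assumptions, not the paper's method):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Model the proportion of "audio-first" responses as a cumulative
# Gaussian of SOA: mu is the PSS, sigma indexes the TBW width.
def cum_gauss(soa, mu, sigma):
    return norm.cdf(soa, loc=mu, scale=sigma)

# Synthetic TOJ data: SOA in ms (negative = audio leads) and the
# observed proportion of "audio-first" responses at each SOA.
soas = np.array([-200, -100, -50, -25, 0, 25, 50, 100, 200], dtype=float)
p_audio_first = np.array([0.02, 0.10, 0.25, 0.40, 0.55, 0.70, 0.80, 0.93, 0.99])

(mu, sigma), _ = curve_fit(cum_gauss, soas, p_audio_first, p0=[0.0, 50.0])

pss = mu                           # SOA at which events feel simultaneous
tbw = 2 * norm.ppf(0.75) * sigma   # width between the 25% and 75% points

print(f"PSS = {pss:.1f} ms, TBW ~ {tbw:.1f} ms")
```

Per-participant PSS and TBW estimates obtained this way could then be entered into correlation and multiple-regression analyses against SSQ scores, in the spirit of the study's analysis.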