Multisensory Research: Latest Publications

Multisensory Effects on Illusory Self-Motion (Vection): the Role of Visual, Auditory, and Tactile Cues.
IF 1.6 · Zone 4 · Psychology
Multisensory Research Pub Date : 2021-08-11 DOI: 10.1163/22134808-bja10058
Brandy Murovec, Julia Spaniol, Jennifer L Campos, Behrang Keshavarz
{"title":"Multisensory Effects on Illusory Self-Motion (Vection): the Role of Visual, Auditory, and Tactile Cues.","authors":"Brandy Murovec, Julia Spaniol, Jennifer L Campos, Behrang Keshavarz","doi":"10.1163/22134808-bja10058","DOIUrl":"10.1163/22134808-bja10058","url":null,"abstract":"<p><p>A critical component to many immersive experiences in virtual reality (VR) is vection, defined as the illusion of self-motion. Traditionally, vection has been described as a visual phenomenon, but more recent research suggests that vection can be influenced by a variety of senses. The goal of the present study was to investigate the role of multisensory cues on vection by manipulating the availability of visual, auditory, and tactile stimuli in a VR setting. To achieve this, 24 adults (Mage = 25.04) were presented with a rotating stimulus aimed to induce circular vection. All participants completed trials that included a single sensory cue, a combination of two cues, or all three cues presented together. The size of the field of view (FOV) was manipulated across four levels (no-visuals, small, medium, full). Participants rated vection intensity and duration verbally after each trial. Results showed that all three sensory cues induced vection when presented in isolation, with visual cues eliciting the highest intensity and longest duration. The presence of auditory and tactile cues further increased vection intensity and duration compared to conditions where these cues were not presented. These findings support the idea that vection can be induced via multiple types of sensory inputs and can be intensified when multiple sensory inputs are combined.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39304487","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Orienting Auditory Attention through Vision: the Impact of Monaural Listening.
IF 1.6 · Zone 4 · Psychology
Multisensory Research Pub Date : 2021-08-11 DOI: 10.1163/22134808-bja10059
Silvia Turri, Mehdi Rizvi, Giuseppe Rabini, Alessandra Melonio, Rosella Gennari, Francesco Pavani
{"title":"Orienting Auditory Attention through Vision: the Impact of Monaural Listening.","authors":"Silvia Turri,&nbsp;Mehdi Rizvi,&nbsp;Giuseppe Rabini,&nbsp;Alessandra Melonio,&nbsp;Rosella Gennari,&nbsp;Francesco Pavani","doi":"10.1163/22134808-bja10059","DOIUrl":"https://doi.org/10.1163/22134808-bja10059","url":null,"abstract":"<p><p>The understanding of linguistic messages can be made extremely complex by the simultaneous presence of interfering sounds, especially when they are also linguistic in nature. In two experiments, we tested if visual cues directing attention to spatial or temporal components of speech in noise can improve its identification. The hearing-in-noise task required identification of a five-digit sequence (target) embedded in a stream of time-reversed speech. Using a custom-built device located in front of the participant, we delivered visual cues to orient attention to the location of target sounds and/or their temporal window. In Exp. 1 ( n = 14), we validated this visual-to-auditory cueing method in normal-hearing listeners, tested under typical binaural listening conditions. In Exp. 2 ( n = 13), we assessed the efficacy of the same visual cues in normal-hearing listeners wearing a monaural ear plug, to study the effects of simulated monaural and conductive hearing loss on visual-to-auditory attention orienting. While Exp. 1 revealed a benefit of both spatial and temporal visual cues for hearing in noise, Exp. 2 showed that only the temporal visual cues remained effective during monaural listening. These findings indicate that when the acoustic experience is altered, visual-to-auditory attention orienting is more robust for temporal compared to spatial attributes of the auditory stimuli. These findings have implications for the relation between spatial and temporal attributes of sound objects, and when planning devices to orient audiovisual attention for subjects suffering from hearing loss.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39304486","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
Neural Basis of the Sound-Symbolic Crossmodal Correspondence Between Auditory Pseudowords and Visual Shapes.
IF 1.8 · Zone 4 · Psychology
Multisensory Research Pub Date : 2021-08-11 DOI: 10.1163/22134808-bja10060
Kelly McCormick, Simon Lacey, Randall Stilla, Lynne C Nygaard, K Sathian
{"title":"Neural Basis of the Sound-Symbolic Crossmodal Correspondence Between Auditory Pseudowords and Visual Shapes.","authors":"Kelly McCormick, Simon Lacey, Randall Stilla, Lynne C Nygaard, K Sathian","doi":"10.1163/22134808-bja10060","DOIUrl":"10.1163/22134808-bja10060","url":null,"abstract":"<p><p>Sound symbolism refers to the association between the sounds of words and their meanings, often studied using the crossmodal correspondence between auditory pseudowords, e.g., 'takete' or 'maluma', and pointed or rounded visual shapes, respectively. In a functional magnetic resonance imaging study, participants were presented with pseudoword-shape pairs that were sound-symbolically congruent or incongruent. We found no significant congruency effects in the blood oxygenation level-dependent (BOLD) signal when participants were attending to visual shapes. During attention to auditory pseudowords, however, we observed greater BOLD activity for incongruent compared to congruent audiovisual pairs bilaterally in the intraparietal sulcus and supramarginal gyrus, and in the left middle frontal gyrus. We compared this activity to independent functional contrasts designed to test competing explanations of sound symbolism, but found no evidence for mediation via language, and only limited evidence for accounts based on multisensory integration and a general magnitude system. Instead, we suggest that the observed incongruency effects are likely to reflect phonological processing and/or multisensory attention. These findings advance our understanding of sound-to-meaning mapping in the brain.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.8,"publicationDate":"2021-08-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9196751/pdf/nihms-1804729.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10098984","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Exploring Reference Frame Integration Using Response Demands in a Tactile Temporal-Order Judgement Task.
IF 1.6 · Zone 4 · Psychology
Multisensory Research Pub Date : 2021-07-23 DOI: 10.1163/22134808-bja10057
Kaian Unwalla, Daniel Goldreich, David I Shore
{"title":"Exploring Reference Frame Integration Using Response Demands in a Tactile Temporal-Order Judgement Task.","authors":"Kaian Unwalla, Daniel Goldreich, David I Shore","doi":"10.1163/22134808-bja10057","DOIUrl":"10.1163/22134808-bja10057","url":null,"abstract":"<p><p>Exploring the world through touch requires the integration of internal (e.g., anatomical) and external (e.g., spatial) reference frames - you only know what you touch when you know where your hands are in space. The deficit observed in tactile temporal-order judgements when the hands are crossed over the midline provides one tool to explore this integration. We used foot pedals and required participants to focus on either the hand that was stimulated first (an anatomical bias condition) or the location of the hand that was stimulated first (a spatiotopic bias condition). Spatiotopic-based responses produce a larger crossed-hands deficit, presumably by focusing observers on the external reference frame. In contrast, anatomical-based responses focus the observer on the internal reference frame and produce a smaller deficit. This manipulation thus provides evidence that observers can change the relative weight given to each reference frame. We quantify this effect using a probabilistic model that produces a population estimate of the relative weight given to each reference frame. We show that a spatiotopic bias can result in either a larger external weight (Experiment 1) or a smaller internal weight (Experiment 2) and provide an explanation of when each one would occur.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39300169","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
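The "relative weight given to each reference frame" idea can be made concrete with a toy simulation. The sketch below is purely illustrative and is not the authors' published probabilistic model: it assumes Gaussian judgement noise and treats the crossed-hands posture as a conflict in which the external (spatial) frame signals the reversed hand order, so a larger external weight produces a larger crossed-hands deficit. All parameter values (weights, noise level, stimulus onset asynchrony) are hypothetical.

```python
"""Toy weighted reference-frame account of the crossed-hands TOJ deficit.

Illustrative only; not the model reported in the paper.
"""
import math

def p_correct(soa_ms, w_internal, w_external, sigma_ms=80.0, crossed=False):
    """Probability of a correct 'which hand first' response at a given SOA."""
    # Uncrossed: both frames agree, so their weights add.
    # Crossed: the external frame signals the reversed order, so the
    # weights partially cancel and evidence is weaker.
    gain = (w_internal - w_external) if crossed else (w_internal + w_external)
    z = gain * soa_ms / sigma_ms
    # Cumulative normal via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

if __name__ == "__main__":
    # Hypothetical weights: a spatiotopic bias condition is modelled as a
    # larger external weight than an anatomical bias condition.
    for label, w_ext in (("anatomical bias", 0.2), ("spatiotopic bias", 0.5)):
        w_int = 1.0 - w_ext
        uncrossed = p_correct(100.0, w_int, w_ext, crossed=False)
        crossed = p_correct(100.0, w_int, w_ext, crossed=True)
        print(f"{label}: P(correct) uncrossed={uncrossed:.2f}, crossed={crossed:.2f}")
```

Running this shows accuracy at a 100 ms SOA dropping far more in the crossed posture when the external weight is 0.5 than when it is 0.2, which is the qualitative pattern a spatiotopic response demand would be expected to produce.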
Metacognition and Crossmodal Correspondences Between Auditory Attributes and Saltiness in a Large Sample Study.
IF 1.6 · Zone 4 · Psychology
Multisensory Research Pub Date : 2021-07-23 DOI: 10.1163/22134808-bja10055
Qian Janice Wang, Steve Keller, Charles Spence
{"title":"Metacognition and Crossmodal Correspondences Between Auditory Attributes and Saltiness in a Large Sample Study.","authors":"Qian Janice Wang, Steve Keller, Charles Spence","doi":"10.1163/22134808-bja10055","DOIUrl":"10.1163/22134808-bja10055","url":null,"abstract":"<p><p>Mounting evidence demonstrates that people make surprisingly consistent associations between auditory attributes and a number of the commonly-agreed basic tastes. However, the sonic representation of (association with) saltiness has remained rather elusive. In the present study, a crowd-sourced online study ( n = 1819 participants) was conducted to determine the acoustical/musical attributes that best match saltiness, as well as participants' confidence levels in their choices. Based on previous literature on crossmodal correspondences involving saltiness, thirteen attributes were selected to cover a variety of temporal, tactile, and emotional associations. The results revealed that saltiness was associated most strongly with a long decay time, high auditory roughness, and a regular rhythm. In terms of emotional associations, saltiness was matched with negative valence, high arousal, and minor mode. Moreover, significantly higher average confidence ratings were observed for those saltiness-matching choices for which there was majority agreement, suggesting that individuals were more confident about their own judgments when it matched with the group response, therefore providing support for the so-called 'consensuality principle'. Taken together, these results help to uncover the complex interplay of mechanisms behind seemingly surprising crossmodal correspondences between sound attributes and taste.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39300168","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
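The 'consensuality principle' result (higher confidence for choices that agree with the majority) lends itself to a very small worked example. The sketch below uses made-up responses and a hypothetical attribute contrast, not the study's data; it only shows the structure of the comparison: find the majority choice, then compare mean confidence for majority versus minority responders.

```python
"""Minimal consensuality check on hypothetical crowd-sourced matching data."""
from collections import Counter
from statistics import mean

# Each tuple: (attribute chosen as matching 'saltiness', confidence rating 1-7).
responses = [
    ("rough", 6), ("rough", 5), ("smooth", 3),
    ("rough", 7), ("smooth", 4), ("rough", 6),
]

# The group's majority response for this attribute pair.
majority_choice, _ = Counter(choice for choice, _ in responses).most_common(1)[0]

majority_conf = mean(conf for choice, conf in responses if choice == majority_choice)
minority_conf = mean(conf for choice, conf in responses if choice != majority_choice)

print(f"majority option: {majority_choice}")
print(f"mean confidence, majority responders: {majority_conf:.2f}")
print(f"mean confidence, minority responders: {minority_conf:.2f}")
```

In the actual study the comparison was of course made across 1819 participants and thirteen attributes with appropriate statistics; the point here is only the shape of the check.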
Perceptions of Audio-Visual Impact Events in Younger and Older Adults.
IF 1.6 · Zone 4 · Psychology
Multisensory Research Pub Date : 2021-07-21 DOI: 10.1163/22134808-bja10056
Katherine Bak, George S W Chan, Michael Schutz, Jennifer L Campos
{"title":"Perceptions of Audio-Visual Impact Events in Younger and Older Adults.","authors":"Katherine Bak, George S W Chan, Michael Schutz, Jennifer L Campos","doi":"10.1163/22134808-bja10056","DOIUrl":"10.1163/22134808-bja10056","url":null,"abstract":"<p><p>Previous studies have examined whether audio-visual integration changes in older age, with some studies reporting age-related differences and others reporting no differences. Most studies have either used very basic and ambiguous stimuli (e.g., flash/beep) or highly contextualized, causally related stimuli (e.g., speech). However, few have used tasks that fall somewhere between the extremes of this continuum, such as those that include contextualized, causally related stimuli that are not speech-based; for example, audio-visual impact events. The present study used a paradigm requiring duration estimates and temporal order judgements (TOJ) of audio-visual impact events. Specifically, the Schutz-Lipscomb illusion, in which the perceived duration of a percussive tone is influenced by the length of the visual striking gesture, was examined in younger and older adults. Twenty-one younger and 21 older adult participants were presented with a visual point-light representation of a percussive impact event (i.e., a marimbist striking their instrument with a long or short gesture) combined with a percussive auditory tone. Participants completed a tone duration judgement task and a TOJ task. Five audio-visual temporal offsets (-400 to +400 ms) and five spatial offsets (from -90 to +90°) were randomly introduced. Results demonstrated that the strength of the illusion did not differ between older and younger adults and was not influenced by spatial or temporal offsets. Older adults showed an 'auditory first bias' when making TOJs. The current findings expand what is known about age-related differences in audio-visual integration by considering them in the context of impact-related events.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39213139","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
The Lightness/Pitch Crossmodal Correspondence Modulates the Rubin Face/Vase Perception.
IF 1.6 · Zone 4 · Psychology
Multisensory Research Pub Date : 2021-06-16 DOI: 10.1163/22134808-bja10054
Mick Zeljko, Philip M Grove, Ada Kritikos
{"title":"The Lightness/Pitch Crossmodal Correspondence Modulates the Rubin Face/Vase Perception.","authors":"Mick Zeljko, Philip M Grove, Ada Kritikos","doi":"10.1163/22134808-bja10054","DOIUrl":"10.1163/22134808-bja10054","url":null,"abstract":"<p><p>We examine whether crossmodal correspondences (CMCs) modulate perceptual disambiguation by considering the influence of lightness/pitch congruency on the perceptual resolution of the Rubin face/vase (RFV). We randomly paired a black-and-white RFV (black faces and white vase, or vice versa) with either a high or low pitch tone and found that CMC congruency biases the dominant visual percept. The perceptual option that was CMC-congruent with the tone (white/high pitch or black/low pitch) was reported significantly more often than the perceptual option CMC-incongruent with the tone (white/low pitch or black/high pitch). However, the effect was only observed for stimuli presented for longer and not shorter durations suggesting a perceptual effect rather than a response bias, and moreover, we infer an effect on perceptual reversals rather than initial percepts. We found that the CMC congruency effect for longer-duration stimuli only occurred after prior exposure to the stimuli of several minutes, suggesting that the CMC congruency develops over time. These findings extend the observed effects of CMCs from relatively low-level feature-based effects to higher-level object-based perceptual effects (specifically, resolving ambiguity) and demonstrate that an entirely new category of crossmodal factors (CMC congruency) influence perceptual disambiguation in bistability.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39241388","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
The Effects of Cue Reliability on Crossmodal Recalibration in Adults and Children.
IF 1.6 · Zone 4 · Psychology
Multisensory Research Pub Date : 2021-05-31 DOI: 10.1163/22134808-bja10053
Sophie Rohlf, Patrick Bruns, Brigitte Röder
{"title":"The Effects of Cue Reliability on Crossmodal Recalibration in Adults and Children.","authors":"Sophie Rohlf, Patrick Bruns, Brigitte Röder","doi":"10.1163/22134808-bja10053","DOIUrl":"10.1163/22134808-bja10053","url":null,"abstract":"<p><p>Reliability-based cue combination is a hallmark of multisensory integration, while the role of cue reliability for crossmodal recalibration is less understood. The present study investigated whether visual cue reliability affects audiovisual recalibration in adults and children. Participants had to localize sounds, which were presented either alone or in combination with a spatially discrepant high- or low-reliability visual stimulus. In a previous study we had shown that the ventriloquist effect (indicating multisensory integration) was overall larger in the children groups and that the shift in sound localization toward the spatially discrepant visual stimulus decreased with visual cue reliability in all groups. The present study replicated the onset of the immediate ventriloquist aftereffect (a shift in unimodal sound localization following a single exposure of a spatially discrepant audiovisual stimulus) at the age of 6-7 years. In adults the immediate ventriloquist aftereffect depended on visual cue reliability, whereas the cumulative ventriloquist aftereffect (reflecting the audiovisual spatial discrepancies over the complete experiment) did not. In 6-7-year-olds the immediate ventriloquist aftereffect was independent of visual cue reliability. The present results are compatible with the idea of immediate and cumulative crossmodal recalibrations being dissociable processes and that the immediate ventriloquist aftereffect is more closely related to genuine multisensory integration.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39040401","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
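"Reliability-based cue combination" refers to the standard maximum-likelihood model of multisensory integration, in which each cue is weighted by its inverse variance. The sketch below shows that textbook formula with made-up numbers; it is a generic illustration, not the analysis used in this study, but it makes clear why a high-reliability visual stimulus should pull the perceived sound location more strongly (a larger ventriloquist effect) than a low-reliability one.

```python
"""Textbook reliability-weighted (maximum-likelihood) cue combination.

Generic illustration with hypothetical numbers; not the study's analysis.
"""

def combine(visual_loc, visual_sigma, auditory_loc, auditory_sigma):
    """Return the combined location estimate and the weight on the visual cue."""
    r_v = 1.0 / visual_sigma ** 2    # visual reliability = inverse variance
    r_a = 1.0 / auditory_sigma ** 2  # auditory reliability
    w_v = r_v / (r_v + r_a)          # weight given to the visual cue
    estimate = w_v * visual_loc + (1.0 - w_v) * auditory_loc
    return estimate, w_v

if __name__ == "__main__":
    # Hypothetical setup: sound at 0 deg, light at 10 deg, auditory sigma 5 deg.
    for sigma_v in (1.0, 10.0):  # high- vs low-reliability visual cue
        est, w_v = combine(visual_loc=10.0, visual_sigma=sigma_v,
                           auditory_loc=0.0, auditory_sigma=5.0)
        print(f"visual sigma = {sigma_v:4.1f} deg -> w_v = {w_v:.2f}, "
              f"combined estimate = {est:.2f} deg")
```

In this scheme, lowering visual reliability (a larger visual sigma) shifts weight toward the auditory cue and reduces the visually induced shift in sound localization; how such weighting relates to the immediate and cumulative aftereffects is the question the study addresses.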
The Effect of Simultaneously Presented Words and Auditory Tones on Visuomotor Performance.
IF 1.6 · Zone 4 · Psychology
Multisensory Research Pub Date : 2021-05-28 DOI: 10.1163/22134808-bja10052
Rita Mendonça, Margarida V Garrido, Gün R Semin
{"title":"The Effect of Simultaneously Presented Words and Auditory Tones on Visuomotor Performance.","authors":"Rita Mendonça, Margarida V Garrido, Gün R Semin","doi":"10.1163/22134808-bja10052","DOIUrl":"10.1163/22134808-bja10052","url":null,"abstract":"<p><p>The experiment reported here used a variation of the spatial cueing task to examine the effects of unimodal and bimodal attention-orienting primes on target identification latencies and eye gaze movements. The primes were a nonspatial auditory tone and words known to drive attention consistent with the dominant writing and reading direction, as well as introducing a semantic, temporal bias (past-future) on the horizontal dimension. As expected, past-related (visual) word primes gave rise to shorter response latencies on the left hemifield and future-related words on the right. This congruency effect was differentiated by an asymmetric performance on the right space following future words and driven by the left-to-right trajectory of scanning habits that facilitated search times and eye gaze movements to lateralized targets. Auditory tone prime alone acted as an alarm signal, boosting visual search and reducing response latencies. Bimodal priming, i.e., temporal visual words paired with the auditory tone, impaired performance by delaying visual attention and response times relative to the unimodal visual word condition. We conclude that bimodal primes were no more effective in capturing participants' spatial attention than the unimodal auditory and visual primes. Their contribution to the literature on multisensory integration is discussed.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-05-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39040402","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Typical Crossmodal Numerosity Perception in Preterm Newborns.
IF 1.6 · Zone 4 · Psychology
Multisensory Research Pub Date : 2021-05-12 DOI: 10.1163/22134808-bja10051
Giovanni Anobile, Maria C Morrone, Daniela Ricci, Francesca Gallini, Ilaria Merusi, Francesca Tinelli
{"title":"Typical Crossmodal Numerosity Perception in Preterm Newborns.","authors":"Giovanni Anobile, Maria C Morrone, Daniela Ricci, Francesca Gallini, Ilaria Merusi, Francesca Tinelli","doi":"10.1163/22134808-bja10051","DOIUrl":"10.1163/22134808-bja10051","url":null,"abstract":"<p><p>Premature birth is associated with a high risk of damage in the parietal cortex, a key area for numerical and non-numerical magnitude perception and mathematical reasoning. Children born preterm have higher rates of learning difficulties for school mathematics. In this study, we investigated how preterm newborns (born at 28-34 weeks of gestation age) and full-term newborns respond to visual numerosity after habituation to auditory stimuli of different numerosities. The results show that the two groups have a similar preferential looking response to visual numerosity, both preferring the incongruent set after crossmodal habituation. These results suggest that the numerosity system is resistant to prematurity.</p>","PeriodicalId":51298,"journal":{"name":"Multisensory Research","volume":null,"pages":null},"PeriodicalIF":1.6,"publicationDate":"2021-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"38909099","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0