Latest Articles in Seeing and Perceiving

Synaesthesia and the SNARC effect
Seeing and Perceiving Pub Date: 2012-01-01 DOI: 10.1163/187847612X648477
Clare N. Jonas
In number-form synaesthesia, numbers become explicitly mapped onto portions of space in the mind's eye or around the body. However, non-synaesthetes are also known to map number onto space, though in an implicit way. For example, those who are literate in a language that is written in a left-to-right direction are likely to assign small numbers to the left side of space and large numbers to the right side of space (e.g., Dehaene et al., 1993). In non-synaesthetes, this mapping is flexible (e.g., numbers map onto a circular form if the participant is primed to do so by the appearance of a clock-face), which has been interpreted as a response to task demands (e.g., Bachtold et al., 1998) or as evidence of a linguistically-mediated, rather than a direct, link between number and space (e.g., Proctor and Cho, 2006). We investigated whether synaesthetes' number forms show the same flexibility during an odd-or-even judgement task that tapped linguistic associations between number and space (following Gevers et al., 2010). Synaesthetes and non-synaesthetes alike mapped small numbers to the verbal label 'left' and large numbers to the verbal label 'right'. This surprising result may indicate that synaesthetes' number forms are also the result of a linguistic link between number and space, instead of a direct link between the two, or that performance on tasks such as these is not mediated by the number form.
Volume 25, Issue 1, pages 221-221.
Citations: 0
Generalization of visual shapes by flexible and simple rules.
Seeing and Perceiving Pub Date: 2012-01-01 Epub Date: 2011-07-19 DOI: 10.1163/187847511X571519
Bart Ons, Johan Wagemans
Rules and similarity are at the heart of our understanding of human categorization. However, it is difficult to distinguish their roles because both determinants of categorization are confounded in many real situations. Rules are based on a number of identical properties between objects, but these correspondences also make objects appear more similar. Here, we introduced a stimulus set in which rules and similarity were unconfounded, and we let participants generalize category examples towards new instances. We also introduced a method based on the frequency distribution of the formed partitions in the stimulus sets, which allowed us to verify the role of rules and similarity in categorization. Our evaluation favoured the rule-based account. The most preferred rules were the simplest ones, and they consisted of recurrent visual properties (regularities) in the stimulus set. Additionally, we created different variants of the same stimulus set and tested the moderating influence of small changes in the appearance of the stimulus material. A conceptual manipulation (Experiment 1) had no influence, but all visual manipulations (Experiments 2 and 3) had strong influences on participants' reliance on particular rules, indicating that prior beliefs about category-defining rules are rather flexible.
Volume 25, Issue 3-4, pages 237-261.
Citations: 9
Electrophysiological correlates of tactile and visual perception during goal-directed movement
Seeing and Perceiving Pub Date: 2012-01-01 DOI: 10.1163/187847612X648008
G. Juravle, T. Heed, C. Spence, B. Roeder
Tactile information arriving at our sensory receptors is differentially processed over the various temporal phases of goal-directed movements. By using event-related potentials (ERPs), we investigated the neuronal correlates of tactile information processing during movement. Participants performed goal-directed reaches for an object placed centrally on the table in front of them. Tactile and visual stimuli were presented in separate trials during the different phases of the movement (i.e., preparation, execution, and post-movement). These stimuli were independently delivered to either the moving or the resting hand. In a control condition, the participants only performed the movement, while omission (movement-only) ERPs were recorded. Participants were told to ignore the presence or absence of any sensory events and solely concentrate on the execution of the movement. The results highlighted enhanced ERPs between 80 and 200 ms after tactile stimulation, and between 100 and 250 ms after visual stimulation. These modulations were greatest over the execution phase of the goal-directed movement; they were effector-based (i.e., significantly more negative for stimuli presented at the moving hand) and modality-independent (i.e., similar ERP enhancements were observed for both tactile and visual stimuli). The enhanced processing of sensory information over the execution phase of the movement suggests that incoming sensory information may be used for a potential adjustment of the current motor plan. Moreover, these results indicate a tight interaction between attentional mechanisms and the sensorimotor system.
Volume 25, Issue 1, pages 170-170.
Citations: 0
Temporal disparity effects on audiovisual integration in low vision individuals
Seeing and Perceiving Pub Date: 2012-01-01 DOI: 10.1163/187847612X648044
Stefano Targher, Valeria Occelli, M. Zampini
Our recent findings have shown that sounds improve visual detection in low vision individuals when the audiovisual pairs are presented simultaneously. The present study investigates possible temporal aspects of the audiovisual enhancement effect that we have previously reported. Low vision participants were asked to detect the presence of a visual stimulus (yes/no task) either presented in isolation or together with an auditory stimulus at different SOAs. In the first experiment, when the sound always led the visual stimuli, there was a significant visual detection enhancement even when the visual stimulus was temporally delayed by 400 ms. However, the visual detection improvement was reduced in the second experiment, when the sound could randomly lead or lag the visual stimulus. A significant enhancement was found only when the audiovisual stimuli were synchronized. Taken together, the results of the present study seem to suggest that high-level associations between modalities might modulate audiovisual interactions in low vision individuals.
Volume 25, Issue 1, pages 175-175.
Citations: 0
Predictable variations in auditory pitch modulate the spatial processing of visual stimuli: An ERP study
Seeing and Perceiving Pub Date: 2012-01-01 DOI: 10.1163/187847612X646488
Fátima Vera-Constán, Irune Fernández-Prieto, Joel García-Morera, J. Navarra
We investigated whether perceiving predictable 'ups and downs' in acoustic pitch (as can be heard in musical melodies) can influence the spatial processing of visual stimuli as a consequence of a 'spatial recoding' of sound (see Foster and Zatorre, 2010; Rusconi et al., 2006). Event-related potentials (ERPs) were recorded while participants performed a color discrimination task on a visual target that could appear either above or below a centrally-presented fixation point. Each experimental trial started with an auditory isochronous stream of 11 tones including a high- and a low-pitched tone. The visual target appeared isochronously after the last tone. In the 'non-predictive' condition, the tones were presented in an erratic fashion (e.g., 'high-low-low-high-high-low-high …'). In the 'predictive' condition, the melodic combination of high- and low-pitched tones was highly predictable (e.g., 'low-high-low-high-low …'). Within the predictive condition, the visual stimuli appeared congruently or incongruently with respect to the melody ('… low-high-low-high-low-UP' or '… low-high-low-high-low-DOWN', respectively). Participants showed faster responses when the visual target appeared after a predictive melody. Electrophysiologically, early (25–150 ms) amplitude effects of predictability were observed in frontal and parietal regions, spreading to central regions (N1) afterwards. Predictability effects were also found in the P2–N2 complex and the P3 in central and parietal regions. Significant auditory-to-visual congruency effects were also observed in the parieto-occipital P3 component. Our findings reveal the existence of crossmodal effects of perceiving auditory isochronous melodies on visual temporal orienting. More importantly, our results suggest that pitch information can be transformed into a spatial code that shapes spatial processing in other modalities such as vision.
Volume 36, Issue 1, pages 25-25.
Citations: 0
Assessing audiovisual saliency and visual-information content in the articulation of consonants and vowels on audiovisual temporal perception
Seeing and Perceiving Pub Date: 2012-01-01 DOI: 10.1163/187847612X646514
A. Vatakis, C. Spence
Research has revealed different temporal integration windows between and within different speech tokens. The limited set of speech tokens tested to date has not allowed for a proper evaluation of whether such differences are task- or stimulus-driven. We conducted a series of experiments to investigate how the physical differences associated with speech articulation affect the temporal aspects of audiovisual speech perception. Videos of consonants and vowels uttered by three speakers were presented. Participants made temporal order judgments (TOJs) regarding which speech stream had been presented first. The sensitivity of participants' TOJs and the point of subjective simultaneity (PSS) were analyzed as a function of the place, manner of articulation, and voicing for consonants, and the height/backness of the tongue and lip-roundedness for vowels. The results demonstrated that, for place of articulation/roundedness, participants were more sensitive to the temporal order of highly-salient speech signals, with smaller visual leads at the PSS. This was not the case when the manner of articulation/height was evaluated. These findings suggest that the visual speech signal provides substantial cues to the auditory signal that modulate the relative processing times required for the perception of the speech stream. A subsequent experiment explored how the presentation of different sources of visual information modulated these findings. Videos of three consonants were presented under natural and point-light (PL) viewing conditions revealing parts, or the whole, of the face. Preliminary analysis revealed no differences in TOJ accuracy under different viewing conditions. However, the PSS data revealed significant differences between viewing conditions depending on the speech token uttered (e.g., larger visual leads for PL-lip/teeth/tongue-only views).
Volume 25, Issue 1, pages 29-29.
Citations: 0
Somatosensory amplification and illusory tactile sensations
Seeing and Perceiving Pub Date: 2012-01-01 DOI: 10.1163/187847612X646569
Vrushant Lakhlani, Kirsten J. McKenzie
Experimental studies have demonstrated that it is possible to induce convincing bodily distortions in neurologically healthy individuals through cross-modal manipulations, such as the rubber hand illusion (Botvinick and Cohen, 1998), the parchment skin illusion (Jousmaki and Hari, 1998) and the Somatic Signal Detection Task (SSDT; Lloyd et al., 2008). It has been shown previously with the SSDT that when a tactile stimulus is presented with a simultaneous light flash, individuals show both increased sensitivity to the tactile stimulus and a tendency to report feeling the stimulus even when one was not presented, a tendency which varies greatly between individuals but remains constant over time within an individual (McKenzie et al., 2010). Further studies into tactile stimulus discrimination using the Somatic Signal Discrimination Task (SSDiT) have also shown that a concurrent light led to a significant improvement in people's ability to discriminate 'weak' tactile stimuli from 'strong' ones, as well as a bias towards reporting any tactile stimulus as 'strong' (Poliakoff et al., in preparation), indicating that the light may influence both early and later stages of processing. The current study investigated whether the tendency to report higher numbers of false alarms when carrying out the SSDT is correlated with the tendency to experience higher numbers of cross-modal 'enhancements' of weak tactile signals (leading to classifications of 'weak' stimuli as strong, and 'strong' stimuli as 'stronger'). Results will be discussed.
Volume 25, Issue 1, pages 34-34.
Citations: 0
Updating expectancies about audiovisual associations in speech
Seeing and Perceiving Pub Date: 2012-01-01 DOI: 10.1163/187847612X647946
Tim Paris, Jeesun Kim, C. Davis
The processing of multisensory information depends on the learned association between sensory cues. In the case of speech, there is a well-learned association between the movements of the lips and the subsequent sound. That is, particular lip and mouth movements reliably lead to a specific sound. EEG and MEG studies that have investigated the differences between this 'congruent' AV association and other 'incongruent' associations have commonly reported ERP differences from 350 ms after sound onset. Using a 256 active electrode EEG system, we tested whether this 'congruency effect' would be reduced in a context where most of the trials had an altered audiovisual association (auditory speech paired with mismatched visual lip movements). Participants were presented with stimuli over two sessions: in one session only 15% of trials were incongruent; in the other session, 85% were incongruent. We found a congruency effect, showing differences in ERPs between congruent and incongruent speech between 350 and 500 ms. Importantly, this effect was reduced within the context of mostly incongruent trials. This reduction in the congruency effect indicates that the way in which AV speech is processed depends on the context in which it is viewed. Furthermore, this result suggests that exposure to novel sensory relationships leads to updated expectations regarding the relationship between auditory and visual speech cues.
Volume 120, Issue 1, pages 164-164.
Citations: 0
Features of the human rod bipolar cell ERG response during fusion of scotopic flicker.
Seeing and Perceiving Pub Date: 2012-01-01 DOI: 10.1163/187847612x648792
Allison M Cameron, Jacqueline S C Lam
The ability of the eye to distinguish between intermittently presented flash stimuli is a measure of the temporal resolution of vision. The aim of this study was to examine the relationship between the features of the human rod bipolar cell response (as measured from the scotopic ERG b-wave) and the psychophysically measured critical fusion frequency (CFF). Stimuli consisted of dim (−0.04 Td·s), blue flashes presented either singly, or as flash pairs (at a range of time separations, between 5 and 300 ms). Single flashes of double intensity (−0.08 Td·s) were also presented as a reference. Visual responses to flash pairs were measured via (1) recording of the ERG b-wave, and (2) threshold determinations of the CFF using a two-alternative forced-choice method (flicker vs. fused illumination). The results of this experiment suggest that b-wave responses to flash pairs separated by < 100 ms are electrophysiologically similar to those obtained with single flashes of double intensity. Psychophysically, the percepts of flash pairs < 100 ms apart appeared fused. In conclusion, the visual system's ability to discriminate between scotopic stimuli may be determined by the response characteristics of the rod bipolar cell, or perhaps by the rod photoreceptor itself.
Volume 25, Issue 6, pages 545-560.
Citations: 2
Combining fiber tracking and functional brain imaging for revealing brain networks involved in auditory–visual integration in humans
Seeing and Perceiving Pub Date: 2012-01-01 DOI: 10.1163/187847612X646280
A. Beer, Tina Plank, Evangelia-Regkina Symeonidou, G. Meyer, M. Greenlee
Previous functional magnetic resonance imaging (MRI) studies found various brain areas in the temporal and occipital lobes involved in integrating auditory and visual object information. Fiber tracking based on diffusion-weighted MRI suggested neuroanatomical connections between auditory cortex and sub-regions of the temporal and occipital lobes. However, the relationship between functional activity and white-matter tracks remained unclear. Here, we combined probabilistic tracking and functional MRI in order to reveal the structural connections related to auditory–visual object perception. Ten healthy people were examined by diffusion-weighted and functional MRI. During functional examinations they viewed either movies of lip or body movements, listened to corresponding sounds (phonological sounds or body action sounds), or a combination of both. We found that phonological sounds elicited stronger activity in the lateral superior temporal gyrus (STG) than body action sounds. Body movements elicited stronger activity in the lateral occipital cortex than lip movements. Functional activity in the phonological STG region and the lateral occipital body area was mutually modulated (sub-additively) by combined auditory–visual stimulation. Moreover, bimodal stimuli engaged a region in the posterior superior temporal sulcus (STS). Probabilistic tracking revealed white-matter tracks between the auditory cortex and sub-regions of the STS (anterior and posterior) and occipital cortex. The posterior STS region was also found to be relevant for auditory–visual object perception. The anterior STS region showed connections to the phonological STG area and to the lateral occipital body area. Our findings suggest that multisensory networks in the temporal lobe are best revealed by combining functional and structural measures.
Volume 25, Issue 1, pages 5-5.
Citations: 1