Latest publications from Seeing and Perceiving

Synaesthesia and the SNARC effect
Seeing and Perceiving Pub Date : 2012-01-01 DOI: 10.1163/187847612X648477
Clare N. Jonas
{"title":"Synaesthesia and the SNARC effect","authors":"Clare N. Jonas","doi":"10.1163/187847612X648477","DOIUrl":"https://doi.org/10.1163/187847612X648477","url":null,"abstract":"In number-form synaesthesia, numbers become explicitly mapped onto portions of space in the mind’s eye or around the body. However, non-synaesthetes are also known to map number onto space, though in an implicit way. For example, those who are literate in a language that is written in a left-to-right direction are likely to assign small numbers to the left side of space and large numbers to the right side of space (e.g., Dehaene et al., 1993). In non-synaesthetes, this mapping is flexible (e.g., numbers map onto a circular form if the participant is primed to do so by the appearance of a clock-face), which has been interpreted as a response to task demands (e.g., Bachtold et al., 1998) or as evidence of a linguistically-mediated, rather than a direct, link between number and space (e.g., Proctor and Cho, 2006). We investigated whether synaesthetes’ number forms show the same flexibility during an odd-or-even judgement task that tapped linguistic associations between number and space (following Gevers et al., 2010). Synaesthetes and non-synaesthetes alike mapped small numbers to the verbal label ‘left’ and large numbers to the verbal label ‘right’. This surprising result may indicate that synaesthetes’ number forms are also the result of a linguistic link between number and space, instead of a direct link between the two, or that performance on tasks such as these is not mediated by the number form.","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"221-221"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X648477","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64428952","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Generalization of visual shapes by flexible and simple rules.
Seeing and Perceiving Pub Date : 2012-01-01 Epub Date: 2011-07-19 DOI: 10.1163/187847511X571519
Bart Ons, Johan Wagemans
{"title":"Generalization of visual shapes by flexible and simple rules.","authors":"Bart Ons,&nbsp;Johan Wagemans","doi":"10.1163/187847511X571519","DOIUrl":"https://doi.org/10.1163/187847511X571519","url":null,"abstract":"<p><p>Rules and similarity are at the heart of our understanding of human categorization. However, it is difficult to distinguish their role as both determinants of categorization are confounded in many real situations. Rules are based on a number of identical properties between objects but these correspondences also make objects appearing more similar. Here, we introduced a stimulus set where rules and similarity were unconfounded and we let participants generalize category examples towards new instances. We also introduced a method based on the frequency distribution of the formed partitions in the stimulus sets, which allowed us to verify the role of rules and similarity in categorization. Our evaluation favoured the rule-based account. The most preferred rules were the simplest ones and they consisted of recurrent visual properties (regularities) in the stimulus set. Additionally, we created different variants of the same stimulus set and tested the moderating influence of small changes in appearance of the stimulus material. A conceptual manipulation (Experiment 1) had no influence but all visual manipulations (Experiment 2 and 3) had strong influences in participants' reliance on particular rules, indicating that prior beliefs of category defining rules are rather flexible.</p>","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 3-4","pages":"237-61"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847511X571519","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"30016545","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 9
Features of the human rod bipolar cell ERG response during fusion of scotopic flicker.
Seeing and Perceiving Pub Date : 2012-01-01 DOI: 10.1163/187847612x648792
Allison M Cameron, Jacqueline S C Lam
{"title":"Features of the human rod bipolar cell ERG response during fusion of scotopic flicker.","authors":"Allison M Cameron,&nbsp;Jacqueline S C Lam","doi":"10.1163/187847612x648792","DOIUrl":"https://doi.org/10.1163/187847612x648792","url":null,"abstract":"<p><p>The ability of the eye to distinguish between intermittently presented flash stimuli is a measure of the temporal resolution of vision. The aim of this study was to examine the relationship between the features of the human rod bipolar cell response (as measured from the scotopic ERG b-wave) and the psychophysically measured critical fusion frequency (CFF). Stimuli consisted of dim (-0.04 Td x s), blue flashes presented either singly, or as flash pairs (at a range of time separations, between 5 and 300 ms). Single flashes of double intensity (-0.08 Td x s) were also presented as a reference. Visual responses to flash pairs were measured via (1) recording of the ERG b-wave, and (2) threshold determinations of the CFF using a two-alternative forced-choice method (flicker vs. fused illumination). The results of this experiment suggest that b-wave responses to flash pairs separated by < 100 ms are electrophysiologically similar to those obtained with single flashes of double intensity. Psychophysically, the percepts of flash pairs < 100 ms apart appeared fused. In conclusion, the visual system's ability to discriminate between scotopic stimuli may be determined by the response characteristics of the rod bipolar cell, or perhaps by the rod photoreceptor itself.</p>","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 6","pages":"545-60"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612x648792","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"40138004","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
Combining fiber tracking and functional brain imaging for revealing brain networks involved in auditory–visual integration in humans
Seeing and Perceiving Pub Date : 2012-01-01 DOI: 10.1163/187847612X646280
A. Beer, Tina Plank, Evangelia-Regkina Symeonidou, G. Meyer, M. Greenlee
{"title":"Combining fiber tracking and functional brain imaging for revealing brain networks involved in auditory–visual integration in humans","authors":"A. Beer, Tina Plank, Evangelia-Regkina Symeonidou, G. Meyer, M. Greenlee","doi":"10.1163/187847612X646280","DOIUrl":"https://doi.org/10.1163/187847612X646280","url":null,"abstract":"Previous functional magnetic resonance imaging (MRI) found various brain areas in the temporal and occipital lobe involved in integrating auditory and visual object information. Fiber tracking based on diffusion-weighted MRI suggested neuroanatomical connections between auditory cortex and sub-regions of the temporal and occipital lobe. However, the relationship between functional activity and white-matter tracks remained unclear. Here, we combined probabilistic tracking and functional MRI in order to reveal the structural connections related to auditory–visual object perception. Ten healthy people were examined by diffusion-weighted and functional MRI. During functional examinations they viewed either movies of lip or body movements, listened to corresponding sounds (phonological sounds or body action sounds), or a combination of both. We found that phonological sounds elicited stronger activity in the lateral superior temporal gyrus (STG) than body action sounds. Body movements elicited stronger activity in the lateral occipital cortex than lip movements. Functional activity in the phonological STG region and the lateral occipital body area were mutually modulated (sub-additive) by combined auditory–visual stimulation. Moreover, bimodal stimuli engaged a region in the posterior superior temporal sulcus (STS). Probabilistic tracking revealed white-matter tracks between the auditory cortex and sub-regions of the STS (anterior and posterior) and occipital cortex. The posterior STS region was also found to be relevant for auditory–visual object perception. The anterior STS region showed connections to the phonological STG area and to the lateral occipital body area. Our findings suggest that multisensory networks in the temporal lobe are best revealed by combining functional and structural measures.","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"5-5"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X646280","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64426554","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
Investigating task and modality switching costs using bimodal stimuli
Seeing and Perceiving Pub Date : 2012-01-01 DOI: 10.1163/187847612X646451
Rajwant Sandhu, B. Dyson
{"title":"Investigating task and modality switching costs using bimodal stimuli","authors":"Rajwant Sandhu, B. Dyson","doi":"10.1163/187847612X646451","DOIUrl":"https://doi.org/10.1163/187847612X646451","url":null,"abstract":"Investigations of concurrent task and modality switching effects have to date been studied under conditions of uni-modal stimulus presentation. As such, it is difficult to directly compare resultant task and modality switching effects, as the stimuli afford both tasks on each trial, but only one modality. The current study investigated task and modality switching using bi-modal stimulus presentation under various cue conditions: task and modality (double cue), either task or modality (single cue) or no cue. Participants responded to either the identity or the position of an audio–visual stimulus. Switching effects were defined as staying within a modality/task (repetition) or switching into a modality/task (change) from trial n − 1 to trial n, with analysis performed on trial n data. While task and modality switching costs were sub-additive across all conditions replicating previous data, modality switching effects were dependent on the modality being attended, and task switching effects were dependent on the task being performed. Specifically, visual responding and position responding revealed significant costs associated with modality and task switching, while auditory responding and identity responding revealed significant gains associated with modality and task switching. The effects interacted further, revealing that costs and gains associated with task and modality switching varying with the specific combination of modality and task type. The current study reconciles previous data by suggesting that efficiently processed modality/task information benefits from repetition while less efficiently processed information benefits from change due to less interference of preferred processing across consecutive trials.","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"22-22"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X646451","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64426709","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
4 year olds localize tactile stimuli using an external frame of reference
Seeing and Perceiving Pub Date : 2012-01-01 DOI: 10.1163/187847612X646631
Jannath Begum, A. Bremner, Dorothy Cowie
{"title":"4 year olds localize tactile stimuli using an external frame of reference","authors":"Jannath Begum, A. Bremner, Dorothy Cowie","doi":"10.1163/187847612X646631","DOIUrl":"https://doi.org/10.1163/187847612X646631","url":null,"abstract":"Adults show a deficit in their ability to localize tactile stimuli to their hands when their arms are in the less familiar, crossed posture (e.g., Overvliet et al., 2011; Shore et al., 2002). It is thought that this ‘crossed-hands effect’ arises due to conflict (when the hands are crossed) between the anatomical and external frames of reference within which touches can be perceived. Pagel et al. (2009) studied this effect in young children and observed that the crossed-hands effect first emerges after 5.5-years. In their task, children were asked to judge the temporal order of stimuli presented across their hands in quick succession. Here, we present the findings of a simpler task in which children were asked to localize a single vibrotactile stimulus presented to either hand. We also compared the effect of posture under conditions in which children either did, or did not, have visual information about current hand posture. With this method, we observed a crossed-hands effect in the youngest age-group testable; 4-year-olds. We conclude that young children localize tactile stimuli with respect to an external frame of reference from early in childhood or before (cf. Bremner et al., 2008). Additionally, when visual information about posture was made available, 4- to 5-year-olds’ tactile localization accuracy in the uncrossed-hands posture deteriorated and the crossed-hands effect disappeared. We discuss these findings with respect to visual–tactile-proprioceptive integration abilities of young children and examine potential sources of the discrepancies between our findings and those of Pagel et al. (2009).","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"41-41"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X646631","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64426730","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Predictable variations in auditory pitch modulate the spatial processing of visual stimuli: An ERP study
Seeing and Perceiving Pub Date : 2012-01-01 DOI: 10.1163/187847612X646488
Fátima Vera-Constán, Irune Fernández-Prieto, Joel García-Morera, J. Navarra
{"title":"Predictable variations in auditory pitch modulate the spatial processing of visual stimuli: An ERP study","authors":"Fátima Vera-Constán, Irune Fernández-Prieto, Joel García-Morera, J. Navarra","doi":"10.1163/187847612X646488","DOIUrl":"https://doi.org/10.1163/187847612X646488","url":null,"abstract":"We investigated whether perceiving predictable ‘ups and downs’ in acoustic pitch (as can be heard in musical melodies) can influence the spatial processing of visual stimuli as a consequence of a ‘spatial recoding’ of sound (see Foster and Zatorre, 2010; Rusconi et al., 2006). Event-related potentials (ERPs) were recorded while participants performed a color discrimination task of a visual target that could appear either above or below a centrally-presented fixation point. Each experimental trial started with an auditory isochronous stream of 11 tones including a high- and a low-pitched tone. The visual target appeared isochronously after the last tone. In the ‘non-predictive’ condition, the tones were presented in an erratic fashion (e.g., ‘high-low-low-high-high-low-high …’). In the ‘predictive condition’, the melodic combination of high- and low-pitched tones was highly predictable (e.g., ‘low-high-low-high-low …’). Within the predictive condition, the visual stimuli appeared congruently or incongruently with respect to the melody (‘… low-high-low-high-low-UP’ or ‘… low-high-low-high-low-DOWN’, respectively). Participants showed faster responses when the visual target appeared after a predictive melody. Electrophysiologically, early (25–150 ms) amplitude effects of predictability were observed in frontal and parietal regions, spreading to central regions (N1) afterwards. Predictability effects were also found in the P2–N2 complex and the P3 in central and parietal regions. Significant auditory-to-visual congruency effects were also observed in the parieto-occipital P3 component. Our findings reveal the existence of crossmodal effects of perceiving auditory isochronous melodies on visual temporal orienting. More importantly, our results suggest that pitch information can be transformed into a spatial code that shapes the spatial processing in other modalities such as vision.","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"36 1","pages":"25-25"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X646488","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64426809","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Assessing audiovisual saliency and visual-information content in the articulation of consonants and vowels on audiovisual temporal perception
Seeing and Perceiving Pub Date : 2012-01-01 DOI: 10.1163/187847612X646514
A. Vatakis, C. Spence
{"title":"Assessing audiovisual saliency and visual-information content in the articulation of consonants and vowels on audiovisual temporal perception","authors":"A. Vatakis, C. Spence","doi":"10.1163/187847612X646514","DOIUrl":"https://doi.org/10.1163/187847612X646514","url":null,"abstract":"Research has revealed different temporal integration windows between and within different speech-tokens. The limited speech-tokens tested to date has not allowed for the proper evaluation of whether such differences are task or stimulus driven? We conducted a series of experiments to investigate how the physical differences associated with speech articulation affect the temporal aspects of audiovisual speech perception. Videos of consonants and vowels uttered by three speakers were presented. Participants made temporal order judgments (TOJs) regarding which speech-stream had been presented first. The sensitivity of participants’ TOJs and the point of subjective simultaneity (PSS) were analyzed as a function of the place, manner of articulation, and voicing for consonants, and the height/backness of the tongue and lip-roundedness for vowels. The results demonstrated that for the case of place of articulation/roundedness, participants were more sensitive to the temporal order of highly-salient speech-signals with smaller visual-leads at the PSS. This was not the case when the manner of articulation/height was evaluated. These findings suggest that the visual-speech signal provides substantial cues to the auditory-signal that modulate the relative processing times required for the perception of the speech-stream. A subsequent experiment explored how the presentation of different sources of visual-information modulated such findings. Videos of three consonants were presented under natural and point-light (PL) viewing conditions revealing parts, or the whole, face. Preliminary analysis revealed no differences in TOJ accuracy under different viewing conditions. However, the PSS data revealed significant differences in viewing conditions depending on the speech token uttered (e.g., larger visual-leads for PL-lip/teeth/tongue-only views).","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"29-29"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X646514","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64426875","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Somatosensory amplification and illusory tactile sensations
Seeing and Perceiving Pub Date : 2012-01-01 DOI: 10.1163/187847612X646569
Vrushant Lakhlani, Kirsten J. McKenzie
{"title":"Somatosensory amplification and illusory tactile sensations","authors":"Vrushant Lakhlani, Kirsten J. McKenzie","doi":"10.1163/187847612X646569","DOIUrl":"https://doi.org/10.1163/187847612X646569","url":null,"abstract":"Experimental studies have demonstrated that it is possible to induce convincing bodily distortions in neurologically healthy individuals, through cross-modal manipulations; such as the rubber hand illusion (Botvinick and Cohen, 1998), the parchment skin illusion (Jousmaki and Hari, 1998) and the Somatic Signal Detection Task (SSDT; Lloyd et al., 2008). It has been shown previously with the SSDT that when a tactile stimulus is presented with a simultaneous light flash, individuals show both increased sensitivity to the tactile stimulus, and the tendency to report feeling the stimulus even when one was not presented; a tendency which varies greatly between individuals but remains constant over time within an individual (McKenzie et al., 2010). Further studies into tactile stimulus discrimination using the Somatic Signal Discrimination Task (SSDiT) have also shown that a concurrent light led to a significant improvement in people’s ability to discriminate ‘weak’ tactile stimuli from ‘strong’ ones, as well as a bias towards reporting any tactile stimulus as ‘strong’ (Poliakoff et al., in preparation), indicating that the light may influence both early and later stages of processing. The current study investigated whether the tendency to report higher numbers of false alarms when carrying out the SSDT is correlated with the tendency to experience higher numbers of cross-modal ‘enhancements’ of weak tactile signals (leading to classifications of ‘weak’ stimuli as strong, and ‘strong’ stimuli as ‘stronger’). Results will be discussed.","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"34-34"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X646569","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64426968","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Spatial codes for movement coordination do not depend on developmental vision
Seeing and Perceiving Pub Date : 2012-01-01 DOI: 10.1163/187847612X646721
T. Heed, B. Roeder
{"title":"Spatial codes for movement coordination do not depend on developmental vision","authors":"T. Heed, B. Roeder","doi":"10.1163/187847612X646721","DOIUrl":"https://doi.org/10.1163/187847612X646721","url":null,"abstract":"When people make oscillating right–left movements with their two index fingers while holding their hands palms down, they find it easier to move the fingers symmetrically (i.e., both fingers towards the middle, then both fingers to the outside) than parallel (i.e., both fingers towards the left, then both fingers towards the right). It was originally proposed that this effect is due to concurrent activation of homologous muscles in the two hands. However, symmetric movements are also easier when one of the hands is turned palm up, thus requiring concurrent use of opposing rather than homologous muscles. This was interpreted to indicate that movement coordination relies on perceptual rather than muscle-based information (Mechsner et al., 2001). The current experiment tested whether the spatial code used in this task depends on vision. Participants made either symmetrical or parallel right–left movements with their two index fingers while their palms were either both facing down, both facing up, or one facing up and one down. Neither in sighted nor in congenitally blind participants did movement execution depend on hand posture. Rather, both groups were always more efficient when making symmetrical rather than parallel movements with respect to external space. We conclude that the spatial code used for movement coordination does not crucially depend on vision. Furthermore, whereas congenitally blind people predominately use body-based (somatotopic) spatial coding in perceptual tasks (Roder et al., 2007), they use external spatial codes in movement tasks, with performance indistinguishable from the sighted.","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"51-51"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X646721","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64427187","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0