{"title":"Migraine, motion sensitivity, and temporal binding","authors":"T. Hullar, Alex K. Malone, Spencer B. Smith, N. N. Chang","doi":"10.1163/187847612X648378","DOIUrl":"https://doi.org/10.1163/187847612X648378","url":null,"abstract":"Little is known about vestibular-related timing processes in patients with disequilibrium. Patients with a history of migraine headaches often complain of significant motion sensitivity and long-term vague imbalance inconsistent with a peripheral vestibular disorder. Some of these people have episodic spells of severe vertigo termed ‘vestibular migraines’. Other patients have no history of migraine but do report significant motion sensitivity. Motion sensitivity has typically been explained as a mismatch between the amplitude of vestibular and other (typically visual) sensory inputs. Another possibility is that motion sensitive patients may suffer from a mismatch in the temporal integration of vestibular and other sensory inputs. Here, we compared the temporal binding window (TBW) of vestibular + auditory stimuli in normal subjects, subjects with motion sensitivity, and those with both migraine and motion sensitivity. We asked subjects undergoing earth-vertical sinusoidal rotations at 0.5 Hz, 128°/s to identify whether a metronome-like series of tone bursts was synchronous with their cyclic motion. We calculated the TBW as the range in time encompassing the middle 68% of the area under the psychometric curve. The TBW in normal subjects was 312 ± 135 ms (mean ± SD), in subjects with motion sensitivity was 454 ± 103 ms, and in subjects with migraine and motion sensitivity was 451 ± 124 ms. The TBW of normal subjects was significantly shorter than the other groups. Temporal errors in perception, as manifested by a prolongation of the TBW, are a plausible mechanism for imbalance in patients with migraine and motion sensitivity.","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"209-209"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X648378","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64428896","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Temporal integration in sound localization via head rotation","authors":"E. Macpherson, Janet K. Kim","doi":"10.1163/187847612X648396","DOIUrl":"https://doi.org/10.1163/187847612X648396","url":null,"abstract":"Information about a sound source’s location in the front/back dimension is present in the relation between head rotation and the resulting changes in interaural time- or level-difference cues. The use of such dynamic cues for localization requires the auditory system to have access to an accurate representation of the orientation and motion of the head in space. We measured, in active and passive rotation conditions, and as a function of head-rotation angle and velocity, normally hearing human listeners’ ability to localize front and rear sources of a low-frequency (0.5–1 kHz) noise band that was not accurately localizable in the absence of head motion. Targets were presented while the head was in motion at velocities of 50–400°/s (active neck rotation) or 25–100°/s (whole-body passive rotation), and were gated on and off as the head passed through a variable-width spatial window. Accuracy increased as window width was increased, which provided access to larger interaural cue changes, but decreased as head-turn velocity increased, which reduced the duration of the stimuli. For both active and passive rotation, these effects were almost exactly reciprocal, such that performance was related primarily to the duration of the stimulus, with ∼100 ms duration required for 75% correct front/back discrimination regardless of the cue-change magnitude or mode of rotation. The efficacy of the dynamic auditory cues in the passive rotation condition suggests that vestibular input is sufficient to inform the auditory system about head motion.","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"211-211"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X648396","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64428903","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multisensory integration enhances coordination: The necessity of a phasing matching between cross-modal events and movements","authors":"Gregory Zelic, Denis Mottet, J. Lagarde","doi":"10.1163/187847612X648404","DOIUrl":"https://doi.org/10.1163/187847612X648404","url":null,"abstract":"Recent research revealed what substrates may subserve the fascinating capacity of the brain to put together different senses, from single cell to extending networks (see for review, Driver and Noesselt, 2008; Ghazanfar and Schroeder, 2006; Sperdin et al., 2010; Stein and Stanford, 2008), and lead to interesting behavioral benefits in response to cross-modal events such as shorter reaction times, easier detections or more precise synchronization (Diederich and Colonius, 2004; Elliott et al., 2010). But what happens when a combination of multisensory perception and action is required? This is a key issue, since the organization of movements in space–time in harmony with our surrounding environment is the basis of our everdaylife. Surprisingly enough, little is known about how different senses and movement are combined dynamically. Coordination skills allow to test the effectiveness of such a combination, since external events have been shown to stabilize the coordination performance when adequately tuned (Fink et al., 2000). We then tested the modulation of the capacity of participants to produce an anti-symmetric rhythmic bimanual coordination while synchronizing with audio–tactile versus audio and tactile metronomes pacing the coordination from low to high rates of motion. Three condition of metronome structure found to stabilize the anti-symmetric mode have been handled: Simple, Double and Lateralized. We found redundant signal effects for Lateralized metronomes, but not for Simple and Double metronomes, rather explained by neural audio–tactile interactions than by a simple statistical redundancy. These results reflect the effective cortical cooperation between components in charge of the audio–tactile integration and ones sustaining the anti-symmetric coordination pattern. We will discuss the apparent necessity for cross-modal events to match the phasing of movements to greater stabilize the coordination.","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"42 1","pages":"212-213"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X648404","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64428936","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The aftereffects of ventriloquism: the time course of the visual recalibration of auditory localization.","authors":"Ilja Frissen, Jean Vroomen, Beatrice de Gelder","doi":"10.1163/187847611X620883","DOIUrl":"https://doi.org/10.1163/187847611X620883","url":null,"abstract":"<p><p>Exposure to synchronous but spatially discordant auditory and visual inputs produces adaptive recalibration of the respective localization processes, which manifest themselves in measurable aftereffects. Here we report two experiments that examined the time course of visual recalibration of apparent sound location in order to establish the build-up and dissipation of recalibration. In Experiment 1 participants performed a sound localization task before and during exposure to an auditory-visual discrepancy. In Experiment 2, participants performed a sound localization task before and after 60, 180 or 300 exposures to the discrepancy and aftereffects were measured across a series of post-adaptation sound localization trials. The results show that recalibration is very fast. Substantial aftereffects are obtained after only 18-24 exposures and asymptote appears to be reached between 60 and 180 exposures. The rate of adaptation was independent of the size of the discrepancy. The retention of the aftereffect was strong, as we found no dissipation, not even after as few as 60 exposure trials.</p>","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"1-14"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847611X620883","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"30477176","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"New laws of simultaneous contrast?","authors":"Vebjørn Ekroll, Franz Faul","doi":"10.1163/187847612X626363","DOIUrl":"https://doi.org/10.1163/187847612X626363","url":null,"abstract":"<p><p>Drawing on many seemingly disparate and unrelated lines of evidence, we argue that the direction of the simultaneous contrast effect in three-dimensional colour space is given by the difference vector between target and surround ('direction hypothesis'). This challenges the traditional idea according to which the direction of the simultaneous contrast effect is complementary to the colour of the surround ('complementarity law'). We also argue that the size of the simultaneous contrast effect is either constant or decreases with the difference between target and surround in three-dimensional colour space. The latter proposal challenges Kirschmann's fourth law. Within our theoretical framework, the universally presumed validity of the complementarity law and Kirschmann's fourth law can be understood as resulting from the failure to take various confounding factors into account when interpreting empirical data, the most prominent of which is the influence of temporal von Kries adaptation.</p>","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 2","pages":"107-41"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X626363","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"30491563","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Tactile picture recognition: errors are in shape acquistion or object matching?","authors":"Amy A Kalia, Pawan Sinha","doi":"10.1163/187847511X584443","DOIUrl":"https://doi.org/10.1163/187847511X584443","url":null,"abstract":"<p><p>Numerous studies have demonstrated that sighted and blind individuals find it difficult to recognize tactile pictures of common objects. However, it is still not clear what makes recognition of tactile pictures so difficult. One possibility is that observers have difficulty acquiring the global shape of the image when feeling it. Alternatively, observers may have an accurate understanding of the shape but are unable to link it to a particular object representation. We, therefore, conducted two experiments to determine where tactile picture recognition goes awry. In Experiment 1, we found that recognition of tactile pictures by blindfolded sighted observers correlated with image characteristics that affect shape acquisition (symmetry and complexity). In Experiment 2, we asked drawing experts to draw what they perceived after feeling the images. We found that the experts produced three types of drawings when they could not recognize the tactile pictures: (1) drawings that did not look like objects (incoherent), (2) drawings that looked like incorrect objects (coherent but inaccurate) and (3) drawings that looked like the correct objects ( coherent and accurate). The majority of errors seemed to result from inaccurate perception of the global shape of the image (error types 1 and 2). Our results suggest that recognition of simplistic tactile pictures of objects is largely inhibited by low-level tactile shape processing rather than high-level object recognition mechanisms.</p>","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 3-4","pages":"287-302"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847511X584443","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"30102230","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Dissociable crossmodal recruitment of visual and auditory cortex for tactile perception","authors":"J. Yau, P. Celnik, S. Hsiao, J. Desmond","doi":"10.1163/187847612X646307","DOIUrl":"https://doi.org/10.1163/187847612X646307","url":null,"abstract":"Primary sensory areas previously thought to be devoted to a single modality can exhibit multisensory responses. Some have interpreted these responses as evidence for crossmodal recruitment (i.e., primary sensory processing for inputs in a non-primary modality); however, the direct contribution of this activity to perception is unclear. We tested the specific contributions of visual and auditory cortex to tactile perception in healthy adult volunteers using anodal transcranial direct current stimulation (tDCS). This form of non-invasive neuromodulation can enhance neural excitability and facilitate learning. In a series of psychophysical experiments we characterized participants’ ability to discriminate grating orientation or vibration frequency. We measured perceptual sensitivity before, during, and after tDCS application over either visual cortex or auditory cortex. Each participant received both anodal and sham interventions on separate sessions in counterbalanced order. We found that anodal stimulation over visual cortex selectively improved tactile spatial acuity, but not frequency sensitivity. Conversely, anodal stimulation over auditory cortex selectively improved tactile frequency sensitivity, but not spatial acuity. Furthermore, we found that improvements in tactile perception persisted after cessation of tDCS. These results reveal a clear double-dissociation in the crossmodal contributions of visual and auditory cortex to tactile perception, and support a supramodal brain organization scheme in which visual and auditory cortex comprise distributed networks that support shape and frequency perception, independent of sensory input modality.","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"7-7"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X646307","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64426147","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Audiovisual stimulus-driven contributions to spatial orienting in ecologically valid situations: An fMRI study","authors":"D. Nardo, Valerio Santangelo, E. Macaluso","doi":"10.1163/187847612X646389","DOIUrl":"https://doi.org/10.1163/187847612X646389","url":null,"abstract":"Mechanisms of audiovisual attention have been extensively investigated, yet little is known about their functioning in ecologically-valid situations. Here, we investigated brain activity associated with audiovisual stimulus-driven attention using naturalistic stimuli. We created 120 short videos (2.5 s) showing scenes of everyday life. Each video included a visual event comprising a lateralized (left/right) increase in visual saliency (e.g., an actor moving an object), plus a co-occurring sound either on the same or the opposite side of space. Subjects viewed the videos with/without the associated sounds, and either in covert (central fixation) or overt (eye-movements allowed) viewing conditions. For each stimulus, we used computational models (‘saliency maps’) to characterize auditory and visual stimulus-driven signals, and eye-movements (recorded in free viewing) as a measure of the efficacy of these signals for spatial orienting. Results showed that visual saliency modulated activity in the occipital cortex contralateral to the visual event; while auditory saliency modulated activity in the superior temporal gyrus bilaterally. In the posterior parietal cortex activity increased with increasing auditory saliency, but only when the auditory stimulus was on the same side as the visual event. The efficacy of the stimulus-driven signals modulated activity in the visual cortex. We conclude that: (1) audiovisual attention can be successfully investigated in real-like situations; (2) activity in sensory areas reflects a combination of stimulus-driven signals (saliency) and their efficacy for spatial orienting; (3) posterior parietal cortex processes auditory input as a function of its spatial relationship with the visual input.","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"16-16"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X646389","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64426454","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Predicting multisensory enhancement in neuronal responses","authors":"B. Rowland","doi":"10.1163/187847612X646253","DOIUrl":"https://doi.org/10.1163/187847612X646253","url":null,"abstract":"The most dramatic physiological example of multisensory integration is response enhancement, where the integration of concordant signals across multiple sensory modalities leads to a larger and more reliable response. In the model system of the superior colliculus, the largest enhancements (often greater than the predicted sum) are observed when the individual signals being combined are weak. This principle conforms to expectations based on signal detection theory, and also as expected, enhancement is not uniform throughout any response. Typically it is greatest near its onset, when the unisensory inputs are at their weakest (Initial Response Enhancement, see Rowland et al., 2007; Rowland and Stein, 2008). Despite the general accuracy of this heuristic, however, there is a substantial amount of variance in the degree of observed enhancement at all levels of responsiveness. This observation appears to violate standard Bayesian predictions that are based on overall response magnitude. Aside from statistical noise, a possible explanation is that individual neurons in the dataset are calibrated to different ‘computational modes’. An alternative hypothesis is that the amount of enhancement is influenced greatly by response properties other than magnitude, specifically, the temporal profile of the response. The present analysis advances the latter hypothesis. We present a mechanistic framework that explains these findings and extends the standard Bayesian approach to generate an accurate prediction for the multisensory response profile given known unisensory response profiles. These predictions offer a ‘null hypothesis’ that can be used to quantify the circumstances and timing of anomalies in the integrative processes in different experimental conditions; for example, when it is developing under different conditions, or when it is disrupted by experimental or surgical intervention at any stage of life.","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"3-3"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X646253","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64426543","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The effect of vision on tactile TOJ during arm crossing","authors":"Makoto Wada, K. Kansaku","doi":"10.1163/187847612X646578","DOIUrl":"https://doi.org/10.1163/187847612X646578","url":null,"abstract":"When people cross their arms, subjective rank ordering of successive unseen tactile stimuli delivered to both arms is affected (often being reversed) (Shore et al., 2002; Yamamoto and Kitazawa, 2001). It is also known that vision plays a significant role in modulating perceived limb position (Graziano et al., 2000). In this study, we examined the effect of vision; i.e., eyes opening and closing on tactile temporal order judgment (TOJ) with their arms crossed or uncrossed. In a psychophysical experiment, participants ( n = 18 , 13 males, 27.3 ± 1.8 y.o.) were required to judge temporal order of two tactile stimuli that were delivered to their both ring fingers with four conditions: uncrossed arms with eyes closed, crossed arms with eyes closed, uncrossed arms with eyes open and crossed arms with eyes open. To evaluate judgment probabilities of the participants, degree of reversals of their judgment was calculated as the sum of differences between correct response rates of the arms crossed condition and those of the arms uncrossed condition. In arms uncrossed conditions, judgment probabilities of the participants were not significantly different between eyes closed and open conditions. In contrast, reversal of the judgment with eyes closed was significantly larger than that with eyes open in arms crossed conditions ( p < 0 . 05 ). The results suggest that vision play a significant role in tactile order judgment when the subject arms crossed.","PeriodicalId":49553,"journal":{"name":"Seeing and Perceiving","volume":"25 1","pages":"35-35"},"PeriodicalIF":0.0,"publicationDate":"2012-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1163/187847612X646578","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"64426974","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}