{"title":"Exploring perceptual grouping by proximity principle in multistable dot lattices: Dissociation between vision-for-perception and vision-for-action","authors":"Hamze Moazzen, Shahriar Gharibzadeh, Fatemeh Bakouie","doi":"10.3758/s13414-024-02928-0","DOIUrl":"10.3758/s13414-024-02928-0","url":null,"abstract":"<div><p>Perceptual grouping, a fundamental mechanism in our visual system, significantly influences our interpretation of and interaction with the surrounding world. This study explores the impact of the proximity principle from the perspective of the Two Visual Systems (TVS) model. The TVS model argues that the visual system comprises two distinct streams: the ventral stream, which forms the neural basis for “vision-for-perception,” and the dorsal stream, which underlies “vision-for-action.” We designed a perceptual grouping task using dot lattices as well as a line-orientation discrimination task. Data were collected using vocal and mouse methods for the vision-for-perception mode, and joystick and pen-paper methods for the vision-for-action mode. Each method, except for vocal, included separate blocks for right and left hands. The proximity data were fitted using exponential and power models. Linear mixed-effects models were used for the statistical analyses. The results revealed similar line-orientation discrimination accuracy across all conditions. The exponential model emerged as the best fit, demonstrating adherence to the Pure Distance Law in both perceptual modes. Sensitivity to the proximity principle was higher in the vision-for-action mode compared to the vision-for-perception. In terms of orientation biases, a strong preference for vertical orientation was observed in the vision-for-perception mode, whereas a noticeable preference toward either of the oblique orientations was detected in the vision-for-action mode. Analysis of free-drawn lines demonstrated an affordance bias in the vision-for-action mode. This suggests a remarkable tendency to perceive organizations within specific orientations that offer more affordances due to the interaction between the body postures and tools.</p></div>","PeriodicalId":55433,"journal":{"name":"Attention Perception & Psychophysics","volume":"86 6","pages":"2053 - 2077"},"PeriodicalIF":1.7,"publicationDate":"2024-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141876823","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Timbral brightness perception investigated through multimodal interference","authors":"Charalampos Saitis, Zachary Wallmark","doi":"10.3758/s13414-024-02934-2","DOIUrl":"10.3758/s13414-024-02934-2","url":null,"abstract":"<div><p>Brightness is among the most studied aspects of timbre perception. Psychoacoustically, sounds described as “bright” versus “dark” typically exhibit a high versus low frequency emphasis in the spectrum. However, relatively little is known about the neurocognitive mechanisms that facilitate these <i>metaphors we listen with</i>. Do they originate in universal magnitude representations common to more than one sensory modality? Triangulating three different interaction paradigms, we investigated using speeded classification whether intramodal, crossmodal, and amodal interference occurs when timbral brightness, as modeled by the centroid of the spectral envelope, and pitch height/visual brightness/numerical value processing are semantically congruent and incongruent. In four online experiments varying in priming strategy, onset timing, and response deadline, 189 total participants were presented with a baseline stimulus (a pitch, gray square, or numeral) then asked to quickly identify a target stimulus that is higher/lower, brighter/darker, or greater/less than the baseline after being primed with a bright or dark synthetic harmonic tone. Results suggest that timbral brightness modulates the perception of pitch and possibly visual brightness, but not numerical value. Semantically incongruent pitch height-timbral brightness shifts produced significantly slower reaction time (RT) and higher error compared to congruent pairs. In the visual task, incongruent pairings of gray squares and tones elicited slower RTs than congruent pairings (in two experiments). No interference was observed in the number comparison task. These findings shed light on the embodied and multimodal nature of experiencing timbre.</p></div>","PeriodicalId":55433,"journal":{"name":"Attention Perception & Psychophysics","volume":"86 6","pages":"1835 - 1845"},"PeriodicalIF":1.7,"publicationDate":"2024-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11410849/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141876825","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Phonetic categorization in phonological lexical neighborhoods: Facilitatory and inhibitory effects","authors":"Yubin Zhang","doi":"10.3758/s13414-024-02931-5","DOIUrl":"10.3758/s13414-024-02931-5","url":null,"abstract":"<div><p>Phonetic processing, whereby the bottom-up speech signal is translated into higher-level phonological representations such as phonemes, has been demonstrated to be influenced by phonological lexical neighborhoods. Previous studies show facilitatory effects of lexicality and phonological neighborhood density on phonetic categorization. However, given the evidence for lexical competition in spoken word recognition, we hypothesize that there are concurrent facilitatory and inhibitory effects of phonological lexical neighborhoods on phonetic processing. In Experiments 1 and 2, participants categorized the onset phoneme in word-nonword and nonword-word acoustic continua. The results show that the target word of the continuum exhibits facilitatory lexical influences whereas rhyme neighbors inhibit phonetic categorization. The results support the hypothesis that sublexical phonetic processing is affected by multiple facilitatory and inhibitory lexical forces in the processing stream.</p></div>","PeriodicalId":55433,"journal":{"name":"Attention Perception & Psychophysics","volume":"86 6","pages":"2136 - 2152"},"PeriodicalIF":1.7,"publicationDate":"2024-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11410893/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141876824","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Automatic imitation is modulated by stimulus clarity but not by animacy","authors":"Hannah Wilt, Yuchunzi Wu, Antony Trotter, Patti Adank","doi":"10.3758/s13414-024-02935-1","DOIUrl":"10.3758/s13414-024-02935-1","url":null,"abstract":"<div><p>Observing actions evokes an automatic imitative response that activates mechanisms required to execute these actions. Automatic imitation is measured using the Stimulus Response Compatibility (SRC) task, which presents participants with compatible and incompatible prompt-distractor pairs. Automatic imitation, or the <i>compatibility effect</i>, is the difference in response times (RTs) between incompatible and compatible trials. Past results suggest that an action’s animacy affects automatic imitation: human-produced actions evoke larger effects than computer-generated actions. However, it appears that animacy effects occur mostly when non-human stimuli are less complex or less clear. Theoretical accounts make conflicting predictions regarding both stimulus manipulations. We conducted two SRC experiments that presented participants with an animacy manipulation (human and computer-generated stimuli, Experiment 1) and a clarity manipulation (stimuli with varying visual clarity using Gaussian blurring, Experiments 1 and 2) to tease apart effect of these manipulations. Participants in Experiment 1 responded slower for incompatible than for compatible trials, showing a compatibility effect. Experiment 1 found a null effect of animacy, but stimuli with lower visual clarity evoked smaller compatibility effects. Experiment 2 modulated clarity in five steps and reports decreasing compatibility effects for stimuli with lower clarity. Clarity, but not animacy, therefore affected automatic imitation, and theoretical implications and future directions are considered.</p></div>","PeriodicalId":55433,"journal":{"name":"Attention Perception & Psychophysics","volume":"86 6","pages":"2078 - 2092"},"PeriodicalIF":1.7,"publicationDate":"2024-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11411005/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141861686","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Size adaptation: Do you know it when you see it?","authors":"Sami R. Yousif, Sam Clarke","doi":"10.3758/s13414-024-02925-3","DOIUrl":"10.3758/s13414-024-02925-3","url":null,"abstract":"<div><p>The visual system adapts to a wide range of visual features, from lower-level features like color and motion to higher-level features like causality and, perhaps, number. According to some, adaptation is a strictly perceptual phenomenon, such that the presence of adaptation licenses the claim that a feature is truly perceptual in nature. Given the theoretical importance of claims about adaptation, then, it is important to understand exactly when the visual system does and does not exhibit adaptation. Here, we take as a case study one specific kind of adaptation: visual adaptation to <i>size</i>. Supported by evidence from four experiments, we argue that, despite robust effects of size adaptation in the lab, (1) size adaptation effects are phenomenologically underwhelming (in some cases, hardly appreciable at all), (2) some effects of size adaptation appear contradictory, and difficult to explain given current theories of size adaptation, and (3) prior studies on size adaptation may have failed to isolate size as the adapted dimension. Ultimately, we argue that while there is evidence to license the claim that size adaptation is genuine, size adaptation is a puzzling and poorly understood phenomenon.</p></div>","PeriodicalId":55433,"journal":{"name":"Attention Perception & Psychophysics","volume":"86 6","pages":"1923 - 1937"},"PeriodicalIF":1.7,"publicationDate":"2024-07-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11410845/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141794129","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Investigating an effort avoidance account of attentional strategy choice","authors":"Tianyu Zhang, Andrew B. Leber","doi":"10.3758/s13414-024-02927-1","DOIUrl":"10.3758/s13414-024-02927-1","url":null,"abstract":"<div><p>People often choose suboptimal attentional control strategies during visual search. This has been at least partially attributed to the avoidance of the cognitive effort associated with the optimal strategy, but aspects of the task triggering such avoidance remain unclear. Here, we attempted to measure effort avoidance of an isolated task component to assess whether this component might drive suboptimal behavior. We adopted a modified version of the Adaptive Choice Visual Search (ACVS), a task designed to measure people’s visual search strategies. To perform optimally, participants must make a numerosity judgment—estimating and comparing two color sets—before they can advantageously search through the less numerous of the two. If participants skip the numerosity judgment step, they can still perform accurately, albeit substantially more slowly. To study whether effort associated with performing the optional numerosity judgment could be an obstacle to optimal performance, we created a variant of the demand selection task to quantify the avoidance of numerosity judgment effort. Results revealed a robust avoidance of the numerosity judgment, offering a potential explanation for why individuals choose suboptimal strategies in the ACVS task. Nevertheless, we did not find a significant relationship between individual numerosity judgment avoidance and ACVS optimality, and we discussed potential reasons for this lack of an observed relationship. Altogether, our results showed that the effort avoidance for specific subcomponents of a visual search task can be probed and linked to overall strategy choices.</p></div>","PeriodicalId":55433,"journal":{"name":"Attention Perception & Psychophysics","volume":"86 6","pages":"1989 - 2002"},"PeriodicalIF":1.7,"publicationDate":"2024-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11411006/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141768033","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The speed and phase of locomotion dictate saccade probability and simultaneous low-frequency power spectra.","authors":"Lydia Barnes, Matthew J Davidson, David Alais","doi":"10.3758/s13414-024-02932-4","DOIUrl":"https://doi.org/10.3758/s13414-024-02932-4","url":null,"abstract":"<p><p>Every day we make thousands of saccades and take thousands of steps as we explore our environment. Despite their common co-occurrence in a typical active state, we know little about the coordination between eye movements, walking behaviour and related changes in cortical activity. Technical limitations have been a major impediment, which we overcome here by leveraging the advantages of an immersive wireless virtual reality (VR) environment with three-dimensional (3D) position tracking, together with simultaneous recording of eye movements and mobile electroencephalography (EEG). Using this approach with participants engaged in unencumbered walking along a clear, level path, we find that the likelihood of eye movements at both slow and natural walking speeds entrains to the rhythm of footfall, peaking after the heel-strike of each step. Compared to previous research, this entrainment was captured in a task that did not require visually guided stepping - suggesting a persistent interaction between locomotor and visuomotor functions. Simultaneous EEG recordings reveal a concomitant modulation entrained to heel-strike, with increases and decreases in oscillatory power for a broad range of frequencies. The peak of these effects occurred in the theta and alpha range for slow and natural walking speeds, respectively. Together, our data show that the phase of the step-cycle influences other behaviours such as eye movements, and produces related modulations of simultaneous EEG following the same rhythmic pattern. These results reveal gait as an important factor to be considered when interpreting saccadic and time-frequency EEG data in active observers, and demonstrate that saccadic entrainment to gait may persist throughout everyday activities.</p>","PeriodicalId":55433,"journal":{"name":"Attention Perception & Psychophysics","volume":" ","pages":""},"PeriodicalIF":1.7,"publicationDate":"2024-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141762852","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Skills and cautiousness predict performance in difficult search","authors":"Zoe Jing Xu, Simona Buetti, Yan Xia, Alejandro Lleras","doi":"10.3758/s13414-024-02923-5","DOIUrl":"10.3758/s13414-024-02923-5","url":null,"abstract":"<div><p>People differ in how well they search. What are the factors that might contribute to this variability? We tested the contribution of two cognitive abilities: visual working memory (VWM) capacity and object recognition ability. Participants completed three tasks: a difficult inefficient visual search task, where they searched for a target letter <i>T</i> among skewed <i>L</i> distractors; a VWM task, where they memorized a color array and then identified whether a probed color belonged to the previous array; and the Novel Object Memory Test (NOMT), where they learnt complex novel objects and then identified them amongst objects that closely resembled them. Exploratory and confirmatory factor analyses revealed that there are two latent factors that explain the shared variance among these three tasks: a factor indicative of the level of caution participants exercised during the challenging visual search task, and a factor representing their visual cognitive abilities. People who score high on the search cautiousness tend to perform a more accurate but slower search. People who score high on the visual cognitive ability factor tend to have a higher VWM capacity, a better object recognition ability, and a faster search speed. The results reflect two points: (1) Visual search tasks share components with visual working memory and object recognition tasks. (2) Search performance is influenced not only by the search display’s properties but also by individual predispositions such as caution and general visual abilities. This study introduces new factors for consideration when interpreting variations in visual search behaviors.</p></div>","PeriodicalId":55433,"journal":{"name":"Attention Perception & Psychophysics","volume":"86 6","pages":"1897 - 1912"},"PeriodicalIF":1.7,"publicationDate":"2024-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141602230","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Neural correlates for word-frequency effect in Chinese natural reading.","authors":"Xiaolin Mei, Shuyuan Chen, Xinyi Xia, Bo Yang, Yanping Liu","doi":"10.3758/s13414-024-02894-7","DOIUrl":"https://doi.org/10.3758/s13414-024-02894-7","url":null,"abstract":"<p><p>Word frequency effect has always been of interest for reading research because of its critical role in exploring mental processing underlying reading behaviors. Access to word frequency information has long been considered an indicator of the beginning of lexical processing and the most sensitive marker for studying when the brain begins to extract semantic information Sereno & Rayner, Brain and Cognition, 42, 78-81, (2000), Trends in Cognitive Sciences, 7, 489-493, (2003). While the word frequency effect has been extensively studied in numerous eye-tracking and traditional EEG research using the RSVP paradigm, there is a lack of corresponding evidence in studies of natural reading. To find the neural correlates of the word frequency effect, we conducted a study of Chinese natural reading using EEG and eye-tracking coregistration to examine the time course of lexical processing. Our results reliably showed that the word frequency effect first appeared in the N200 time window and the bilateral occipitotemporal regions. Additionally, the word frequency effect was reflected in the N400 time window, spreading from the occipital region to the central parietal and frontal regions. Our current study provides the first neural correlates for word-frequency effect in natural Chinese reading so far, shedding new light on understanding lexical processing in natural reading and could serve as an important basis for further reading study when considering neural correlates in a realistic manner.</p>","PeriodicalId":55433,"journal":{"name":"Attention Perception & Psychophysics","volume":" ","pages":""},"PeriodicalIF":1.7,"publicationDate":"2024-07-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141592153","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Invariant contexts reduce response time variability in visual search in an age-specific way: A comparison of children, teenagers, and adults","authors":"Chengyu Fan, Artyom Zinchenko, Lihan Chen, Jiao Wu, Yeke Qian, Xuelian Zang","doi":"10.3758/s13414-024-02926-2","DOIUrl":"10.3758/s13414-024-02926-2","url":null,"abstract":"<div><p>Contextual cueing is a phenomenon in which repeatedly encountered arrays of items can enhance the visual search for a target item. This is widely attributed to attentional guidance driven by contextual memory acquired during visual search. Some studies suggest that children may have an immature ability to use contextual cues compared to adults, while others argue that contextual learning capacity is similar across ages. To test the development of context-guided attention, this study compared contextual cueing effects among three age groups: adults (aged 18–33 years, <i>N</i> = 32), teenagers (aged 15–17 years, <i>N</i> = 41), and younger children (aged 8–9 years, <i>N</i> = 43). Moreover, this study introduced a measure of response time variability that tracks fluctuations in response time throughout the experiment, in addition to the conventional analysis of response times. The results showed that all age groups demonstrated significantly faster responses in repeated than non-repeated search contexts. Notably, adults and teenagers exhibited smaller response time variability in repeated contexts than in non-repeated ones, while younger children did not. This implies that children are less efficient at consolidating contextual information into a stable memory representation, which may lead to less stable attentional guidance during visual search.</p></div>","PeriodicalId":55433,"journal":{"name":"Attention Perception & Psychophysics","volume":"86 6","pages":"1974 - 1988"},"PeriodicalIF":1.7,"publicationDate":"2024-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141592152","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}