Precision in spatial working memory examined with mouse pointing

IF 1.5 · CAS Zone 4 (Psychology) · JCR Q4 (Neurosciences)
Siobhan M. McAteer, Anthony McGregor, Daniel T. Smith
DOI: 10.1016/j.visres.2023.108343
Journal: Vision Research, Volume 215, Article 108343
Published: 2023-12-30
Full text: https://www.sciencedirect.com/science/article/pii/S0042698923001670
Citations: 0

Abstract

The capacity of visuospatial working memory (VSWM) is limited. However, there is continued debate surrounding the nature of this capacity limitation. The resource model (Bays et al., 2009) proposes that VSWM capacity is limited by the precision with which visuospatial features can be retained. In one of the few studies of spatial working memory, Schneegans and Bays (2016) report that memory-guided pointing responses show a monotonic decrease in precision as set size increases, consistent with resource models. Here we report two conceptual replications of this study that use mouse responses rather than pointing responses. Overall results are consistent with the resource model, as there was an exponential increase in localisation error and monotonic increases in the probability of misbinding and guessing with increases in set size. However, an unexpected result of Experiment One was that, unlike Schneegans and Bays (2016), imprecision did not increase between set sizes of 2 and 8. Experiment Two replicated this effect and ruled out the possibility that the invariance of imprecision at set sizes greater than 2 was a product of oculomotor strategies during recall. We speculate that differences in imprecision are related to additional visuomotor transformations required for memory-guided mouse localisation compared to memory-guided manual pointing localisation. These data demonstrate the importance of considering the nature of the response modality when interpreting VSWM data.
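The "misbinding" and "guessing" probabilities the abstract refers to are typically estimated by fitting a three-component mixture model in the style of Bays et al. (2009): responses are modelled as a mixture of a noisy report of the target, a noisy report of a non-target item (misbinding), and uniform guessing. The sketch below is a minimal illustration of that model for a circular feature space, not the authors' actual analysis code; all function names and parameters are illustrative.

```python
import numpy as np

def vonmises_pdf(x, kappa):
    # von Mises density on the circle; np.i0 is the modified Bessel function I0
    return np.exp(kappa * np.cos(x)) / (2 * np.pi * np.i0(kappa))

def mixture_loglik(responses, targets, nontargets, p_t, p_n, kappa):
    """Log-likelihood of the three-component mixture model.

    responses, targets: arrays of shape (n_trials,), angles in radians.
    nontargets: shape (n_trials, n_nontargets), angles of the other items.
    p_t: probability of reporting the target; p_n: misbinding probability;
    the remainder 1 - p_t - p_n is uniform guessing.
    """
    p_u = 1.0 - p_t - p_n
    target_term = p_t * vonmises_pdf(responses - targets, kappa)
    nontarget_term = p_n * vonmises_pdf(
        responses[:, None] - nontargets, kappa).mean(axis=1)
    guess_term = p_u / (2 * np.pi)
    return np.log(target_term + nontarget_term + guess_term).sum()
```

In practice the parameters (p_t, p_n, kappa) are found by maximum-likelihood estimation (e.g. with `scipy.optimize.minimize` under the constraints p_t + p_n ≤ 1), and the fitted concentration kappa is the "precision" whose set-size dependence the resource model predicts.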

Source journal: Vision Research (Medicine – Neuroscience)
CiteScore: 3.70
Self-citation rate: 16.70%
Annual articles: 111
Review time: 66 days
Aims and scope: Vision Research is a journal devoted to the functional aspects of human, vertebrate and invertebrate vision and publishes experimental and observational studies, reviews, and theoretical and computational analyses. Vision Research also publishes clinical studies relevant to normal visual function and basic research relevant to visual dysfunction or its clinical investigation. Functional aspects of vision is interpreted broadly, ranging from molecular and cellular function to perception and behavior. Detailed descriptions are encouraged but enough introductory background should be included for non-specialists. Theoretical and computational papers should give a sense of order to the facts or point to new verifiable observations. Papers dealing with questions in the history of vision science should stress the development of ideas in the field.