Non-symbolic estimation of big and small ratios with accurate and noisy feedback

IF 1.7 · JCR Q3 (Psychology) · CAS Tier 4 (Psychology)
Nicola J. Morton, Matt Grice, Simon Kemp, Randolph C. Grace

Journal: Attention, Perception, & Psychophysics
DOI: 10.3758/s13414-024-02914-6
Published: 2024-07-11
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11410853/pdf/
Article link: https://link.springer.com/article/10.3758/s13414-024-02914-6
Citations: 0

Abstract

The ratio of two magnitudes can take one of two values depending on the order they are operated on: a ‘big’ ratio of the larger to smaller magnitude, or a ‘small’ ratio of the smaller to larger. Although big and small ratio scales have different metric properties and carry divergent predictions for perceptual comparison tasks, no psychophysical studies have directly compared them. Two experiments are reported in which subjects implicitly learned to compare pairs of brightnesses and line lengths by non-symbolic feedback based on the scaled big ratio, small ratio or difference of the magnitudes presented. Results of Experiment 1 showed all three operations were learned quickly and estimated with a high degree of accuracy that did not significantly differ across groups or between intensive and extensive modalities, though regressions on individual data suggested an overall predisposition towards differences. Experiment 2 tested whether subjects learned to estimate the operation trained or to associate stimulus pairs with correct responses. For each operation, Gaussian noise was added to the feedback that was constant for repetitions of each pair. For all subjects, coefficients for the added noise component were negative when entered in a regression model alongside the trained differences or ratios, and were statistically significant in 80% of individual cases. Thus, subjects learned to estimate the comparative operations and effectively ignored or suppressed the added noise. These results suggest the perceptual system is highly flexible in its capacity for non-symbolic computation, which may reflect a deeper connection between perceptual structure and mathematics.
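The logic of Experiment 2 can be sketched numerically: if subjects learned the trained operation, their estimates should track the true comparative value and carry a near-zero regression weight on the pair-specific noise added to feedback, whereas if they merely memorised pair–feedback associations, the noise weight should be near one. The following is a minimal simulation of that distinction (the magnitudes, noise scale, and response-error scale are illustrative assumptions, not the paper's actual stimuli or data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

true_value = rng.uniform(1.0, 5.0, n)     # trained operation (e.g., big ratio) per pair
pair_noise = rng.normal(0.0, 0.5, n)      # Gaussian noise, fixed across repetitions of a pair
feedback = true_value + pair_noise        # noisy feedback shown during training

# Subject A learned the operation: responses track the true value (plus response error)
resp_operation = true_value + rng.normal(0.0, 0.2, n)
# Subject B learned pair-feedback associations: responses track the noisy feedback
resp_association = feedback + rng.normal(0.0, 0.2, n)

# Regress responses on the trained value and the added noise component
X = np.column_stack([np.ones(n), true_value, pair_noise])
beta_op, *_ = np.linalg.lstsq(X, resp_operation, rcond=None)
beta_assoc, *_ = np.linalg.lstsq(X, resp_association, rcond=None)

print(f"operation learner:   noise weight = {beta_op[2]:+.3f}")
print(f"association learner: noise weight = {beta_assoc[2]:+.3f}")
```

In this sketch the operation learner's noise coefficient is close to zero and the association learner's is close to one; the paper's finding of reliably *negative* noise coefficients is stronger still, suggesting active suppression of the noise rather than mere insensitivity to it.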


Journal metrics
CiteScore: 3.60
Self-citation rate: 17.60%
Articles published per year: 197
Review time: 4–8 weeks
Journal description: The journal Attention, Perception, & Psychophysics is an official journal of the Psychonomic Society. It spans all areas of research in sensory processes, perception, attention, and psychophysics. Most articles published are reports of experimental work; the journal also presents theoretical, integrative, and evaluative reviews. Commentary on issues of importance to researchers appears in a special section of the journal. Founded in 1966 as Perception & Psychophysics, the journal assumed its present name in 2009.