Analysing the bias introduced by adaptive designs to estimates of psychometric functions
Simon Bang Kristensen, Katrine Bødkergaard, Bo Martin Bibby
Journal of Mathematical Psychology, Volume 124, Article 102899 (published 2025-01-29)
DOI: 10.1016/j.jmp.2025.102899
URL: https://www.sciencedirect.com/science/article/pii/S002224962500001X
Citations: 0
Abstract
An adaptive design adjusts dynamically as information is accrued. In psychometrics and psychophysics, a class of studies investigates a subject's ability to perform tasks as a function of the stimulus intensity, i.e., the amount or clarity of information supplied for the task. The relationship between performance and intensity is represented by a psychometric function. Such experiments routinely apply adaptive designs that use both previous intensities and performance to assign stimulus intensities, the strategy being to sample intensities where information about the psychometric function is maximised. We investigate the influence of adaptation on statistical inference about the psychometric function, focusing on estimation, and consider parametric and non-parametric estimation under both fixed and adaptive designs and under within-subject independence as well as dependence. We study the scenarios analytically and numerically through a simulation study. We show that while asymptotic properties of estimators are preserved under adaptation, the adaptive nature of the design introduces small-sample bias, in particular in the slope parameter of the psychometric function. We supply an explanation of this phenomenon that formalises and supplements the one found in the literature. We argue that this poses a dilemma for studies applying an adaptive design, in the form of a trade-off between more efficient sampling and the need to increase the number of samples to ameliorate small-sample bias.
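The setting described in the abstract can be illustrated with a minimal simulation: a logistic psychometric function, an adaptive rule that assigns each stimulus intensity from the previous response (a simple 1-up/1-down staircase is used here purely as an illustrative stand-in; the paper's actual designs and estimators are more general), and a maximum-likelihood fit of the threshold and slope on the adaptively sampled data. All function names and parameter values below are hypothetical choices for the sketch, not taken from the paper.

```python
import numpy as np

def psychometric(x, alpha, beta):
    """Logistic psychometric function: P(correct response) at intensity x.
    alpha is the threshold (location), beta the slope."""
    return 1.0 / (1.0 + np.exp(-beta * (x - alpha)))

def simulate_adaptive(n_trials, alpha=0.0, beta=2.0, step=0.25, rng=None):
    """Illustrative adaptive design: a 1-up/1-down staircase that lowers
    the intensity after a correct response and raises it after an error,
    so intensities concentrate near the threshold."""
    rng = np.random.default_rng(rng)
    x = 1.0  # starting intensity
    xs, ys = [], []
    for _ in range(n_trials):
        y = rng.random() < psychometric(x, alpha, beta)
        xs.append(x)
        ys.append(int(y))
        x = x - step if y else x + step  # adapt using the last response
    return np.array(xs), np.array(ys)

def fit_ml(xs, ys, grid_alpha, grid_beta):
    """Grid-search maximum-likelihood estimate of (alpha, beta)."""
    best, best_ll = None, -np.inf
    for a in grid_alpha:
        for b in grid_beta:
            p = np.clip(psychometric(xs, a, b), 1e-9, 1 - 1e-9)
            ll = np.sum(ys * np.log(p) + (1 - ys) * np.log(1 - p))
            if ll > best_ll:
                best, best_ll = (a, b), ll
    return best

# Small-sample behaviour of the slope estimate under adaptation:
# repeat short adaptive sessions and collect the fitted slopes.
rng = np.random.default_rng(1)
grid_a = np.linspace(-1.0, 1.0, 41)
grid_b = np.linspace(0.5, 8.0, 76)
slopes = []
for _ in range(200):
    xs, ys = simulate_adaptive(40, rng=rng)
    _, b_hat = fit_ml(xs, ys, grid_a, grid_b)
    slopes.append(b_hat)

# The paper's point is that the mean of such slope estimates deviates
# from the true beta (= 2.0 here) in small samples under adaptive designs.
print(np.mean(slopes))
```

The staircase makes each intensity depend on the preceding response, which is exactly the data-dependence that preserves asymptotics but distorts small-sample estimates; rerunning with larger `n_trials` illustrates the trade-off the abstract describes between sampling efficiency and the sample size needed to dilute the bias.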
About the journal:
The Journal of Mathematical Psychology includes articles, monographs and reviews, notes and commentaries, and book reviews in all areas of mathematical psychology. Empirical and theoretical contributions are equally welcome.
Areas of special interest include, but are not limited to, fundamental measurement and psychological process models, such as those based upon neural network or information processing concepts. A partial listing of substantive areas covered includes sensation and perception, psychophysics, learning and memory, problem solving, judgment and decision-making, and motivation.
The Journal of Mathematical Psychology is affiliated with the Society for Mathematical Psychology.
Research Areas include:
• Models for sensation and perception, learning, memory and thinking
• Fundamental measurement and scaling
• Decision making
• Neural modeling and networks
• Psychophysics and signal detection
• Neuropsychological theories
• Psycholinguistics
• Motivational dynamics
• Animal behavior
• Psychometric theory