Modelling non-linear psychological processes: Reviewing and evaluating non-parametric approaches and their applicability to intensive longitudinal data.
Jan I Failenschmid, Leonie V D E Vogelsmeier, Joris Mulder, Joran Jongerling
British Journal of Mathematical & Statistical Psychology, published 2025-05-30. DOI: 10.1111/bmsp.12397

Psychological concepts are increasingly understood as complex dynamic systems that change over time. To study these systems, researchers are gathering intensive longitudinal data (ILD), revealing non-linear phenomena such as asymptotic growth, mean-level switching, and regulatory oscillations. However, psychological researchers currently lack statistical methods flexible enough to capture these non-linear processes accurately, which hinders theory development. While methods such as local polynomial regression, Gaussian processes, and generalized additive models (GAMs) exist outside of psychology, they are rarely applied within the field because they have not yet been accessibly reviewed and evaluated in the context of ILD. To address this gap, this article introduces these three methods for an applied psychological audience. We further conducted a simulation study, which demonstrates that all three methods infer non-linear processes found in ILD more accurately than polynomial regression. In particular, GAMs closely captured the underlying processes, performing almost as well as the data-generating parametric models. Finally, we illustrate how GAMs can be applied to explore idiographic processes and identify potential phenomena in ILD. This analysis enables psychological researchers to model non-linear processes accurately and to select a method that aligns with their data and research goals.
{"title":"A path signature perspective of process data feature extraction.","authors":"Xueying Tang, Jingchen Liu, Zhiliang Ying","doi":"10.1111/bmsp.12390","DOIUrl":"https://doi.org/10.1111/bmsp.12390","url":null,"abstract":"<p><p>Computer-based interactive items have become prevalent in recent educational assessments. In such items, the entire human-computer interactive process is recorded in a log file and is known as the response process. These data are noisy, diverse, and in a nonstandard format. Several feature extraction methods have been developed to overcome the difficulties in process data analysis. However, these methods often focus on the action sequence and ignore the time sequence in response processes. In this paper, we introduce a new feature extraction method that incorporates the information in both the action sequence and the response time sequence. The method is based on the concept of path signature from stochastic analysis. We apply the proposed method to both simulated data and real response process data from PIAAC. A prediction framework is used to show that taking time information into account provides a more comprehensive understanding of respondents' behaviors.</p>","PeriodicalId":55322,"journal":{"name":"British Journal of Mathematical & Statistical Psychology","volume":" ","pages":""},"PeriodicalIF":1.5,"publicationDate":"2025-05-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144144399","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A supervised learning approach to estimating IRT models in small samples.","authors":"Dmitry I Belov, Oliver Lüdtke, Esther Ulitzsch","doi":"10.1111/bmsp.12396","DOIUrl":"https://doi.org/10.1111/bmsp.12396","url":null,"abstract":"<p><p>Existing estimators of parameters of item response theory (IRT) models exploit the likelihood function. In small samples, however, the IRT likelihood oftentimes contains little informative value, potentially resulting in biased and/or unstable parameter estimates and large standard errors. To facilitate small-sample IRT estimation, we introduce a novel approach that does not rely on the likelihood. Our estimation approach derives features from response data and then maps the features to item parameters using a neural network (NN). We describe and evaluate our approach for the three-parameter logistic model; however, it is applicable to any model with an item characteristic curve. Three types of NNs are developed, supporting the obtainment of both point estimates and confidence intervals for IRT model parameters. The results of a simulation study demonstrate that these NNs perform better than Bayesian estimation using Markov chain Monte Carlo methods in terms of the quality of the point estimates and confidence intervals while also being much faster. These properties facilitate (1) pretesting items in a real-time testing environment, (2) pretesting more items and (3) pretesting items only in a secured environment to eradicate possible compromise of new items in online testing.</p>","PeriodicalId":55322,"journal":{"name":"British Journal of Mathematical & Statistical Psychology","volume":" ","pages":""},"PeriodicalIF":1.5,"publicationDate":"2025-05-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144082247","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

A novel nonvisual procedure for screening for nonstationarity in time series as obtained from intensive longitudinal designs.
Steffen Zitzmann, Christoph Lindner, Julian F Lohmann, Martin Hecht
British Journal of Mathematical & Statistical Psychology, published 2025-04-25. DOI: 10.1111/bmsp.12394

Researchers working with intensive longitudinal designs often encounter the challenge of determining whether to relax the assumption of stationarity in their models. Given that these designs typically involve data from a large number of subjects ($N \gg 1$), visually screening all time series can quickly become tedious. Even when conducted by experts, such screenings can lack accuracy. In this article, we propose a nonvisual procedure that enables fast and accurate screening. This procedure has the potential to become a widely adopted approach for detecting nonstationarity and guiding model building in psychology and related fields where intensive longitudinal designs are used and time series data are collected.

A general diagnostic modelling framework for forced-choice assessments.
Pablo Nájera, Rodrigo S Kreitchmann, Scarlett Escudero, Francisco J Abad, Jimmy de la Torre, Miguel A Sorrel
British Journal of Mathematical & Statistical Psychology, published 2025-04-23. DOI: 10.1111/bmsp.12393

Diagnostic classification modelling (DCM) is a family of restricted latent class models often used in educational settings to assess students' strengths and weaknesses. Recently, there has been growing interest in applying DCM to noncognitive traits in fields such as clinical and organizational psychology, as well as personality profiling. To address common response biases in these assessments, such as social desirability, Huang (2023, Educational and Psychological Measurement, 83, 146) adopted the forced-choice (FC) item format within the DCM framework, developing the FC-DCM. This model assumes that examinees with no clear preference for any statements in an FC block will choose completely at random. Additionally, the unique parametrization of the FC-DCM poses challenges for integration with established DCM frameworks in the literature. In the present study, we enhance the capabilities of DCM by introducing a general diagnostic framework for FC assessments. We present an adaptation of the G-DINA model to accommodate FC responses. Simulation results show that the G-DINA model provides accurate classifications, item parameter estimates, and attribute correlations, outperforming the FC-DCM in realistic scenarios where item discrimination varies. A real FC assessment example further illustrates the better model fit of the G-DINA. Practical recommendations for using the FC format in diagnostic assessments of noncognitive traits are provided.
{"title":"Score-based tests for parameter instability in ordinal factor models.","authors":"Franz Classe, Rudolf Debelak, Christoph Kern","doi":"10.1111/bmsp.12392","DOIUrl":"https://doi.org/10.1111/bmsp.12392","url":null,"abstract":"<p><p>We present a novel approach for computing model scores for ordinal factor models, that is, graded response models (GRMs) fitted with a limited information (LI) estimator. The method makes it possible to compute score-based tests for parameter instability for ordinal factor models. This way, rapid execution of numerous parameter instability tests for multidimensional item response theory (MIRT) models is facilitated. We present a comparative analysis of the performance of the proposed score-based tests for ordinal factor models in comparison to tests for GRMs fitted with a full information (FI) estimator. The new method has a good Type I error rate, high power and is computationally faster than FI estimation. We further illustrate that the proposed method works well with complex models in real data applications. The method is implemented in the lavaan package in R.</p>","PeriodicalId":55322,"journal":{"name":"British Journal of Mathematical & Statistical Psychology","volume":" ","pages":""},"PeriodicalIF":1.5,"publicationDate":"2025-04-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"144052855","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Distinguishing cause from effect in psychological research: An independence-based approach under linear non-Gaussian models.
Dexin Shi, Bo Zhang, Wolfgang Wiedermann, Amanda J Fairchild
British Journal of Mathematical & Statistical Psychology, published 2025-04-15. DOI: 10.1111/bmsp.12391

Distinguishing cause from effect, that is, determining whether x causes y (x → y) or, alternatively, y causes x (y → x), is a primary research goal in many areas of psychology. Despite its importance, determining causal direction from observational data remains a difficult task. In this study, we introduce an independence-based approach for causal discovery between two variables of interest under a linear non-Gaussian model framework. We propose a two-step algorithm based on distance correlations that provides empirical conclusions about the causal direction of effects under realistic conditions typically seen in psychological studies, that is, in the presence of hidden confounders. The performance of the proposed algorithm is evaluated using Monte Carlo simulations. Findings suggest that the algorithm can effectively detect the causal direction between two variables of interest, even in the presence of weak hidden confounders. Moreover, distance correlations provide useful insights into the magnitude of hidden confounding. We provide an empirical example to demonstrate the application of the proposed approach and discuss practical implications and future directions.
{"title":"Effect sizes for experimental research.","authors":"Larry V Hedges","doi":"10.1111/bmsp.12389","DOIUrl":"https://doi.org/10.1111/bmsp.12389","url":null,"abstract":"<p><p>Good scientific practice requires that the reporting of the statistical analysis of experiments should include estimates of effect size as well as the results of tests of statistical significance. Good statistical practice requires that effect size estimates be reported along with some indication of their statistical uncertainty, such as a standard error. This article provides a review of effect sizes for experimental research, including expressions for the standard error of each effect size. It focuses on effect sizes for experiments with treatments having a single degree of freedom but also includes effect sizes for treatments with multiple degrees of freedom having either fixed or random effects.</p>","PeriodicalId":55322,"journal":{"name":"British Journal of Mathematical & Statistical Psychology","volume":" ","pages":""},"PeriodicalIF":1.5,"publicationDate":"2025-03-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143755694","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Fusion of score-differencing and response similarity statistics for detecting examinees with item preknowledge.","authors":"Yongze Xu, Ruihang He, Meiwei Huang, Fang Luo","doi":"10.1111/bmsp.12388","DOIUrl":"https://doi.org/10.1111/bmsp.12388","url":null,"abstract":"<p><p>Item preknowledge (IP) is a prevalent form of test fraud in educational assessment that can compromise test validity. Two common methods for detecting examinees with IP are score-differencing statistics and response similarity index (RSI). These statistics have different applications and respective advantages. In this paper, we propose a new method (Joint Survival Function Method, <math> <semantics><mrow><mtext>JSFM</mtext></mrow> <annotation>$$ mathrm{JSFM} $$</annotation></semantics> </math> ) to combine these two types of statistics to calculate a fusion statistic that tries to address the issue of distribution differences between the original indicators. By combining the advantages of the original indicators, the fusion statistic can more effectively detect examinees with IP. We fused two typical RSI and four typical score-differencing statistics using different methods and compared their performance. The results demonstrate that the proposed <math> <semantics><mrow><mtext>JSFM</mtext></mrow> <annotation>$$ mathrm{JSFM} $$</annotation></semantics> </math> exhibits strong cross-scenario stability and performs better than other fusion methods.</p>","PeriodicalId":55322,"journal":{"name":"British Journal of Mathematical & Statistical Psychology","volume":" ","pages":""},"PeriodicalIF":1.5,"publicationDate":"2025-03-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143712220","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Optimizing calibration designs with uncertainty in abilities.
Jonas Bjermo, Ellinor Fackle-Fornius, Frank Miller
British Journal of Mathematical & Statistical Psychology, published 2025-03-10. DOI: 10.1111/bmsp.12387

Before items can be implemented in a test, their characteristics need to be calibrated through pretesting. To achieve high-quality tests, it is crucial to maximize the precision of the estimates obtained during item calibration. Higher precision can be attained if calibration items are allocated to examinees based on their individual abilities. Methods from optimal experimental design can be used to derive an optimal ability-matched calibration design. However, such an optimal design assumes that the examinees' abilities are known. In practice, the abilities are unknown and are estimated from a limited number of operational items. We develop the theory for properly handling this uncertainty in abilities and show how the optimal calibration design can be derived when it is taken into account. We demonstrate that the derived designs are more robust when the uncertainty in abilities is acknowledged. The method is implemented in the R package optical.