{"title":"One model may not fit all: Subgroup detection using model-based recursive partitioning","authors":"Marjolein Fokkema , Mirka Henninger , Carolin Strobl","doi":"10.1016/j.jsp.2024.101394","DOIUrl":null,"url":null,"abstract":"<div><div>Model-based recursive partitioning (MOB; Zeileis et al., 2008) is a flexible framework for detecting subgroups of persons showing different effects in a wide range of parametric models. It provides a versatile tool for detecting and explaining heterogeneity in, for example, intervention studies. In this tutorial article, we introduce the general MOB framework. In two specific case studies, we illustrate how MOB-based methods can be used to detect and explain heterogeneity in two widely used frameworks in educational studies: (a) The generalized linear mixed model (GLMM) and (b) item response theory (IRT). In the first case study, we show how GLMM trees (Fokkema et al., 2018) can be used to detect subgroups with different parameters in mixed-effects models. We apply GLMM trees to longitudinal data from a study on the effects of the Head Start pre-school program to identify subgroups of families where children show comparatively larger or smaller gains in performance. In a second case study, we show how Rasch trees (Strobl et al., 2015) can be used to detect subgroups with different item parameters in IRT models (i.e. differential item functioning [DIF]). DIF should be investigated before using test results for group comparisons. We show how a recently developed stopping criterion (Henninger et al., 2023) can be used to guide subgroup detection based on DIF effect sizes.</div></div>","PeriodicalId":48232,"journal":{"name":"Journal of School Psychology","volume":"109 ","pages":"Article 101394"},"PeriodicalIF":3.8000,"publicationDate":"2025-01-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of School Psychology","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0022440524001146","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, SOCIAL","Score":null,"Total":0}
Abstract
Model-based recursive partitioning (MOB; Zeileis et al., 2008) is a flexible framework for detecting subgroups of persons showing different effects in a wide range of parametric models. It provides a versatile tool for detecting and explaining heterogeneity in, for example, intervention studies. In this tutorial article, we introduce the general MOB framework. In two specific case studies, we illustrate how MOB-based methods can be used to detect and explain heterogeneity in two widely used frameworks in educational studies: (a) the generalized linear mixed model (GLMM) and (b) item response theory (IRT). In the first case study, we show how GLMM trees (Fokkema et al., 2018) can be used to detect subgroups with different parameters in mixed-effects models. We apply GLMM trees to longitudinal data from a study on the effects of the Head Start preschool program to identify subgroups of families where children show comparatively larger or smaller gains in performance. In the second case study, we show how Rasch trees (Strobl et al., 2015) can be used to detect subgroups with different item parameters in IRT models (i.e., differential item functioning [DIF]). DIF should be investigated before using test results for group comparisons. We show how a recently developed stopping criterion (Henninger et al., 2023) can be used to guide subgroup detection based on DIF effect sizes.
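As a rough, minimal sketch of how such analyses might be set up in R with the glmertree and psychotree packages (the data sets and variable names below, such as hs_data, score, wave, family_id, ses, parent_edu, irt_data, resp, age, and gender, are hypothetical placeholders rather than the study data):

library(glmertree)   # GLMM trees (Fokkema et al., 2018)
library(psychotree)  # Rasch trees (Strobl et al., 2015)

# GLMM tree: node-specific model | random-effects grouping | partitioning variables
gt <- lmertree(score ~ wave | family_id | ses + parent_edu, data = hs_data)
plot(gt)   # tree with subgroup-specific growth estimates in the terminal nodes

# Rasch tree: item-response matrix ~ candidate DIF variables
rt <- raschtree(resp ~ age + gender, data = irt_data)
plot(rt)   # item parameter profiles for each detected subgroup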
About the journal:
The Journal of School Psychology publishes original empirical articles and critical reviews of the literature on research and practices relevant to psychological and behavioral processes in school settings. JSP presents research on intervention mechanisms and approaches; schooling effects on the development of social, cognitive, mental-health, and achievement-related outcomes; assessment; and consultation. Submissions from a variety of disciplines are encouraged. All manuscripts are read by the Editor and one or more editorial consultants with the intent of providing appropriate and constructive written reviews.