{"title":"Analyzing Count Data in Single Case Experimental Designs with Generalized Linear Mixed Models: Does Serial Dependency Matter?","authors":"Haoran Li, Wen Luo","doi":"10.1080/00273171.2025.2561945","DOIUrl":null,"url":null,"abstract":"<p><p>Single-case experimental designs (SCEDs) involve repeated measurements of a small number of cases under different experimental conditions, offering valuable insights into treatment effects. However, challenges arise in the analysis of SCEDs when autocorrelation is present in the data. Recently, generalized linear mixed models (GLMMs) have emerged as a promising statistical approach for SCEDs with count outcomes. While prior research has demonstrated the effectiveness of GLMMs, these studies have typically assumed error independence, an assumption that may be violated in SCEDs due to serial dependency. This study aims to evaluate two possible solutions for autocorrelated SCED count data: 1) to assess the robustness of previously introduced GLMMs such as Poisson, negative binomial, and observation-level random effects models under various levels of autocorrelation, and 2) to evaluate the performance of a new GLMM and a linear mixed model (LMM), both of which incorporate an autoregressive error structure. Through a Monte Carlo simulation study, we have examined bias, coverage rates, and Type I error rates of treatment effect estimators, providing recommendations for handling autocorrelation in the analysis of SCED count data. A demonstration with real SCED count data is provided. The implications, limitations, and future research directions are also discussed.</p>","PeriodicalId":53155,"journal":{"name":"Multivariate Behavioral Research","volume":" ","pages":"1-25"},"PeriodicalIF":3.5000,"publicationDate":"2025-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Multivariate Behavioral Research","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1080/00273171.2025.2561945","RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
引用次数: 0
Abstract
Single-case experimental designs (SCEDs) involve repeated measurements of a small number of cases under different experimental conditions, offering valuable insights into treatment effects. However, challenges arise in the analysis of SCEDs when autocorrelation is present in the data. Recently, generalized linear mixed models (GLMMs) have emerged as a promising statistical approach for SCEDs with count outcomes. While prior research has demonstrated the effectiveness of GLMMs, these studies have typically assumed error independence, an assumption that may be violated in SCEDs due to serial dependency. This study evaluates two possible solutions for autocorrelated SCED count data: 1) assessing the robustness of previously introduced GLMMs, such as Poisson, negative binomial, and observation-level random effects models, under various levels of autocorrelation, and 2) evaluating the performance of a new GLMM and a linear mixed model (LMM), both of which incorporate an autoregressive error structure. Through a Monte Carlo simulation study, we examined bias, coverage rates, and Type I error rates of treatment effect estimators, providing recommendations for handling autocorrelation in the analysis of SCED count data. A demonstration with real SCED count data is provided. The implications, limitations, and future research directions are also discussed.
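As a rough illustration of the kind of data-generating process such a simulation study targets, the sketch below simulates count outcomes for a single case under an AB phase design, with serial dependency induced by an AR(1) process on the log scale (one common latent-variable route to autocorrelated Poisson counts). The function name and all parameter values (baseline level, treatment effect, autocorrelation rho, residual SD, phase lengths) are illustrative assumptions, not values taken from the article.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_sced_counts(n_baseline=10, n_treatment=10,
                         beta0=1.0, beta1=0.6, rho=0.4, sigma=0.3):
    """Simulate one case's count series for an AB design (illustrative only).

    Counts are Poisson with log-mean beta0 + beta1*phase + e_t, where e_t
    follows a stationary AR(1) process with autocorrelation rho and
    marginal SD sigma.
    """
    n = n_baseline + n_treatment
    phase = np.concatenate([np.zeros(n_baseline), np.ones(n_treatment)])

    # AR(1) errors on the log scale: e_t = rho * e_{t-1} + innovation.
    # Scaling the innovation SD keeps the marginal SD of e_t equal to sigma.
    e = np.empty(n)
    e[0] = rng.normal(0.0, sigma)
    innov_sd = sigma * np.sqrt(1.0 - rho**2)
    for t in range(1, n):
        e[t] = rho * e[t - 1] + rng.normal(0.0, innov_sd)

    mu = np.exp(beta0 + beta1 * phase + e)  # log link
    y = rng.poisson(mu)
    return phase, y

phase, y = simulate_sced_counts()
print("baseline mean:", y[phase == 0].mean(),
      "treatment mean:", y[phase == 1].mean())
```

In a full Monte Carlo study of the sort the abstract describes, many such series (across cases, autocorrelation levels, and replications) would be generated and then analyzed with the competing GLMM and LMM specifications to obtain bias, coverage, and Type I error rates; the sketch above only illustrates the serially dependent count-generation step.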
Journal Introduction:
Multivariate Behavioral Research (MBR) publishes a variety of substantive, methodological, and theoretical articles in all areas of the social and behavioral sciences. Most MBR articles fall into one of two categories. Substantive articles report on applications of sophisticated multivariate research methods to study topics of substantive interest in personality, health, intelligence, industrial/organizational, and other behavioral science areas. Methodological articles present and/or evaluate new developments in multivariate methods, or address methodological issues in current research. We also encourage submission of integrative articles related to pedagogy involving multivariate research methods, and to historical treatments of interest and relevance to multivariate research methods.