Investigating the effect of experience sampling study design on careless and insufficient effort responding identified with a screen-time-based mixture model.
Esther Ulitzsch, Wolfgang Viechtbauer, Oliver Lüdtke, Inez Myin-Germeys, Gabriel Nagy, Steffen Nestler, Gudrun Vera Eisele
DOI: 10.1037/pas0001379
Journal: Psychological Assessment (Q1, JCR category: Psychology, Clinical; impact factor 3.3)
Published: 2025-05-08 (Journal Article)
Citations: 0
Abstract
When using the experience sampling method (ESM), researchers must navigate a delicate balance between obtaining fine-grained snapshots of phenomena of interest and avoiding undue respondent burden, which can lead to disengagement and compromise data quality. To guide that process, we investigated how questionnaire length and sampling frequency impact careless and insufficient effort responding (C/IER), an important yet understudied aspect of ESM data quality. To this end, we made use of existing experimental ESM data (Eisele et al., 2022) from 163 students randomly assigned to one of two questionnaire lengths (30/60 items) and one of three sampling frequencies (3/6/9 assessments per day). We employed a novel mixture modeling approach (Ulitzsch, Nestler, et al., 2024) that leverages screen time data to disentangle attentive responding from C/IER and allows investigating how the occurrence of C/IER evolved within and across ESM study days. We found that sampling frequency, but not questionnaire length, impacted C/IER, with higher frequencies resulting in higher overall C/IER proportions and sharper increases of C/IER across, but not within, days. These effects proved robust across various model specifications. Further, we found no substantial relationships between model-implied C/IER and other engagement measures, such as self-reported attentiveness, attention checks, response-pattern-based attentiveness indicators, and compliance. Our findings contrast with previous studies on noncompliance, suggesting that respondents may employ different strategies to lower the different types of burden imposed by questionnaire length and sampling frequency. Implications for designing ESM studies are discussed. (PsycInfo Database Record (c) 2025 APA, all rights reserved).
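The core idea of screen-time-based mixture modeling — that careless responses tend to be produced much faster than attentive ones, so per-item screen times can be modeled as a mixture of a fast and a slow component — can be illustrated with a minimal sketch. This is an illustrative two-component Gaussian mixture on simulated log screen times fit via expectation-maximization, not the authors' actual model (which is considerably richer and models C/IER trajectories within and across days); all data values and starting parameters below are made up for the example.

```python
import math
import random

random.seed(1)

# Simulated log screen times (log seconds) per assessment:
# attentive responses take longer, careless ones are fast.
attentive = [random.gauss(2.0, 0.4) for _ in range(300)]  # ~e^2.0 ≈ 7.4 s
careless = [random.gauss(0.5, 0.3) for _ in range(100)]   # ~e^0.5 ≈ 1.6 s
x = attentive + careless

def norm_pdf(v, mu, sd):
    """Density of a normal distribution with mean mu and SD sd at v."""
    return math.exp(-0.5 * ((v - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# EM for a two-component Gaussian mixture; component 0 is initialized
# near the fast ("careless") regime, component 1 near the slow one.
mu = [0.0, 2.5]
sd = [1.0, 1.0]
pi = [0.5, 0.5]

for _ in range(200):
    # E-step: posterior probability that each observation is careless.
    resp = []
    for v in x:
        p0 = pi[0] * norm_pdf(v, mu[0], sd[0])
        p1 = pi[1] * norm_pdf(v, mu[1], sd[1])
        resp.append(p0 / (p0 + p1))
    # M-step: update mixing weights, means, and SDs from the posteriors.
    n0 = sum(resp)
    n1 = len(x) - n0
    pi = [n0 / len(x), n1 / len(x)]
    mu[0] = sum(r * v for r, v in zip(resp, x)) / n0
    mu[1] = sum((1 - r) * v for r, v in zip(resp, x)) / n1
    sd[0] = math.sqrt(sum(r * (v - mu[0]) ** 2 for r, v in zip(resp, x)) / n0)
    sd[1] = math.sqrt(sum((1 - r) * (v - mu[1]) ** 2 for r, v in zip(resp, x)) / n1)

# pi[0] estimates the overall C/IER proportion; it should land near
# the simulated 25% careless share.
careless_share = pi[0]
print(round(careless_share, 2))
```

The posterior probabilities (`resp`) are what make the approach useful in practice: rather than hard-classifying each assessment, the model yields a per-observation probability of C/IER that can then be related to design factors such as sampling frequency and questionnaire length.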
Journal description:
Psychological Assessment is concerned mainly with empirical research on measurement and evaluation relevant to the broad field of clinical psychology. Submissions are welcome in the areas of assessment processes and methods. Included are:
- clinical judgment and the application of decision-making models
- paradigms derived from basic psychological research in cognition, personality–social psychology, and biological psychology
- development, validation, and application of assessment instruments, observational methods, and interviews