Dependability of individualized Direct Behavior Rating Multi-Item Scales (DBR-MIS) for academic enablers

Tat Shing Yeung, Robert J. Volpe, Amy M. Briesch, Brian Daniels, Gino Casale

Journal of School Psychology, Volume 107, Article 101389 (2024). DOI: 10.1016/j.jsp.2024.101389
Abstract
The present study examined the dependability of three newly developed direct behavior rating multi-item scales (DBR-MIS) of academic enablers (i.e., academic engagement, interpersonal skills, and study skills). Twenty-two K–5 teachers completed all three 5-item DBR-MIS daily for 1 week for one student in their class. Teachers' ratings on each item during the first occasion were used to create individualized DBR scales with 1–4 items: items with the lowest ratings (indicating the least frequent academic enablers) were included first, and subsequent items were added in ascending order. The dependability of both the full DBR-MIS and the individualized DBR scales was evaluated using generalizability theory. Results indicated that the full DBR-MIS demonstrated high dependability, requiring only 1–4 assessment occasions (i.e., fewer than the criterion of 10) to inform absolute decision-making for progress monitoring. The three- and four-item individualized DBR-MIS demonstrated dependability comparable to that of their respective full five-item DBR-MIS. Dependability estimates of the individualized scales were generally higher than standard D study-derived estimates based on the same number of items (i.e., dependability estimates obtained by manipulating the number of items from the full standard scales modeled in D studies). Results support continued investigation of the DBR-MIS as a viable progress monitoring tool for school-based applications. Further research and implications for practice are discussed.
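The abstract's references to D studies and the number of assessment occasions needed for absolute decision-making come from generalizability theory, where the absolute dependability (phi) coefficient is projected for different measurement designs. As a rough illustration only, the sketch below computes phi for a simple person-by-occasion design and finds the smallest number of occasions meeting a dependability criterion; the variance components and the 0.80 criterion are hypothetical placeholders, not values or functions from the study.

```python
# Minimal D-study sketch from generalizability theory (illustrative only).
# For a person x occasion design, absolute dependability is
#   phi = var_person / (var_person + var_error / n_occasions),
# where var_error pools all occasion-related error variance.

def phi_coefficient(var_person, var_error, n_occasions):
    """Absolute dependability (phi) for n_occasions averaged occasions."""
    return var_person / (var_person + var_error / n_occasions)

def occasions_needed(var_person, var_error, criterion=0.80, max_n=50):
    """Smallest number of occasions at which phi reaches the criterion."""
    for n in range(1, max_n + 1):
        if phi_coefficient(var_person, var_error, n) >= criterion:
            return n
    return None  # criterion not reachable within max_n occasions

if __name__ == "__main__":
    # Hypothetical variance components for a single rating scale.
    var_p, var_e = 0.60, 0.40
    for n in (1, 2, 4):
        print(f"n = {n}: phi = {phi_coefficient(var_p, var_e, n):.3f}")
    print("occasions to reach phi >= .80:", occasions_needed(var_p, var_e))
```

Under these placeholder components, dependability rises quickly as occasions are averaged, which mirrors the abstract's finding that only a few occasions were needed to support absolute decisions.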
Journal Overview
The Journal of School Psychology publishes original empirical articles and critical reviews of the literature on research and practices relevant to psychological and behavioral processes in school settings. JSP presents research on intervention mechanisms and approaches; schooling effects on the development of social, cognitive, mental-health, and achievement-related outcomes; assessment; and consultation. Submissions from a variety of disciplines are encouraged. All manuscripts are read by the Editor and one or more editorial consultants with the intent of providing appropriate and constructive written reviews.