Abigail E Pruitt, Kara M Styck, Annika L Hodges, Christopher J Anthony, Robert J Volpe
Title: Do you see what I see? Mitigating rater effects on direct behavior ratings-multi-item scales (DBR-MIS) through training and statistical adjustment.
DOI: 10.1037/spq0000698
Journal: School psychology (Washington, D.C.)
Publication date: 2025-06-05
Publication type: Journal Article
Citations: 0
Abstract
The purpose of our study was to compare the effectiveness of rater training and statistical adjustment at mitigating rater effects and improving the accuracy of direct behavior ratings-multi-item scales (DBR-MIS) scores targeting academic engagement and disruptive behavior. Results from a many-facet Rasch measurement analysis with a sample of video clips of 15 middle school students rated by 10 graduate students indicated that raters significantly differed in their tendencies toward severity/leniency and that rater training was only successful at reducing between-rater differences on disruptive behavior. Unfortunately, neither rater training nor statistical adjustment improved DBR-MIS score accuracy when compared to direct observation, though improved accuracy was noted for ratings on a single DBR-MIS disruptive behavior item (i.e., "noisy"). School personnel may wish to consider rater training when using DBR-MIS to assess disruptive behavior, and future research should explore the use of statistical modeling to develop customized rater training that incorporates the idiosyncrasies of individual raters. (PsycInfo Database Record (c) 2025 APA, all rights reserved).