Testing Three Alternate Methods to Direct Observation in Measuring Use of Discrete Youth Cognitive Behavioral Techniques: A Secondary Analysis.
{"title":"测试三种替代方法直接观察测量使用离散青少年认知行为技术:二次分析。","authors":"Simone H Schriger, Steven C Marcus, Emily M Becker-Haimes, Shannon Dorsey, David S Mandell, Bryce D McLeod, Sonja K Schoenwald, Rinad S Beidas","doi":"10.1177/26334895251369899","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>Cognitive behavioral therapy (CBT), an umbrella term for therapeutic techniques guided by cognitive behavioral theory, is an evidence-based approach for many psychiatric conditions in youth. A stronger dose of CBT delivery is thought to improve youth clinical outcomes. While a critical indicator of care quality, measuring the use of CBT techniques feasibly and affordably is challenging. Certain CBT techniques (e.g., more concrete and observable) may be easier to measure than others using low-cost methods, such as clinician self-report; however, this has not been studied.</p><p><strong>Method: </strong>To assess the concordance of three methods of measuring CBT technique use with direct observation (DO), clinicians from 27 community agencies (<i>n</i> = 126; <i>M</i> <sub>age</sub> = 37.7 years, <i>SD</i> = 12.8; 76% female) were randomized 1:1:1 to a self-report, chart-stimulated recall (CSR; semistructured interviews with the chart available), or behavioral rehearsal (BR; simulated role-plays) condition. In previous work using a global score aggregating 12 CBT techniques, only BR produced scores that did not differ from DO. This secondary analysis examined the concordance of these alternate methods with DO for each discrete CBT technique, testing for differential concordance across cognitive techniques (e.g., cognitive education) compared to behavioral techniques (e.g., behavioral activation).</p><p><strong>Results: </strong>Results of three-level mixed effects regression models indicated that BR scores did not differ significantly from DO for any techniques, and for nine techniques, neither did CSR (all <i>p</i>s > .05). Contrastingly, self-report scores differed from DO for all but one technique, with greater concordance for behavioral than cognitive techniques (<i>z</i> = -3.29, <i>p</i> <i><</i> .001).</p><p><strong>Conclusions: </strong>Unlike previous findings using an aggregate score, we found that both BR and CSR did not differ significantly from DO for most techniques tested. These findings have implications within implementation research and usual care settings; they support multiple viable measurement methods that are less resource-intensive than DO.</p>","PeriodicalId":73354,"journal":{"name":"Implementation research and practice","volume":"6 ","pages":"26334895251369899"},"PeriodicalIF":2.6000,"publicationDate":"2025-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12444060/pdf/","citationCount":"0","resultStr":"{\"title\":\"Testing Three Alternate Methods to Direct Observation in Measuring Use of Discrete Youth Cognitive Behavioral Techniques: A Secondary Analysis.\",\"authors\":\"Simone H Schriger, Steven C Marcus, Emily M Becker-Haimes, Shannon Dorsey, David S Mandell, Bryce D McLeod, Sonja K Schoenwald, Rinad S Beidas\",\"doi\":\"10.1177/26334895251369899\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Background: </strong>Cognitive behavioral therapy (CBT), an umbrella term for therapeutic techniques guided by cognitive behavioral theory, is an evidence-based approach for many psychiatric conditions in youth. 
A stronger dose of CBT delivery is thought to improve youth clinical outcomes. While a critical indicator of care quality, measuring the use of CBT techniques feasibly and affordably is challenging. Certain CBT techniques (e.g., more concrete and observable) may be easier to measure than others using low-cost methods, such as clinician self-report; however, this has not been studied.</p><p><strong>Method: </strong>To assess the concordance of three methods of measuring CBT technique use with direct observation (DO), clinicians from 27 community agencies (<i>n</i> = 126; <i>M</i> <sub>age</sub> = 37.7 years, <i>SD</i> = 12.8; 76% female) were randomized 1:1:1 to a self-report, chart-stimulated recall (CSR; semistructured interviews with the chart available), or behavioral rehearsal (BR; simulated role-plays) condition. In previous work using a global score aggregating 12 CBT techniques, only BR produced scores that did not differ from DO. This secondary analysis examined the concordance of these alternate methods with DO for each discrete CBT technique, testing for differential concordance across cognitive techniques (e.g., cognitive education) compared to behavioral techniques (e.g., behavioral activation).</p><p><strong>Results: </strong>Results of three-level mixed effects regression models indicated that BR scores did not differ significantly from DO for any techniques, and for nine techniques, neither did CSR (all <i>p</i>s > .05). Contrastingly, self-report scores differed from DO for all but one technique, with greater concordance for behavioral than cognitive techniques (<i>z</i> = -3.29, <i>p</i> <i><</i> .001).</p><p><strong>Conclusions: </strong>Unlike previous findings using an aggregate score, we found that both BR and CSR did not differ significantly from DO for most techniques tested. These findings have implications within implementation research and usual care settings; they support multiple viable measurement methods that are less resource-intensive than DO.</p>\",\"PeriodicalId\":73354,\"journal\":{\"name\":\"Implementation research and practice\",\"volume\":\"6 \",\"pages\":\"26334895251369899\"},\"PeriodicalIF\":2.6000,\"publicationDate\":\"2025-09-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12444060/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Implementation research and practice\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1177/26334895251369899\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2025/1/1 0:00:00\",\"PubModel\":\"eCollection\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Implementation research and practice","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1177/26334895251369899","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/1/1 0:00:00","PubModel":"eCollection","JCR":"","JCRName":"","Score":null,"Total":0}
Simone H Schriger, Steven C Marcus, Emily M Becker-Haimes, Shannon Dorsey, David S Mandell, Bryce D McLeod, Sonja K Schoenwald, Rinad S Beidas
Background: Cognitive behavioral therapy (CBT), an umbrella term for therapeutic techniques guided by cognitive behavioral theory, is an evidence-based approach for many psychiatric conditions in youth. A stronger dose of CBT delivery is thought to improve youth clinical outcomes. Although CBT technique use is a critical indicator of care quality, measuring it feasibly and affordably is challenging. Certain CBT techniques (e.g., those that are more concrete and observable) may be easier than others to measure with low-cost methods such as clinician self-report; however, this has not been studied.
Method: To assess the concordance of three methods of measuring CBT technique use with direct observation (DO), clinicians from 27 community agencies (n = 126; mean age = 37.7 years, SD = 12.8; 76% female) were randomized 1:1:1 to a self-report, chart-stimulated recall (CSR; semistructured interviews with the chart available), or behavioral rehearsal (BR; simulated role-plays) condition. In previous work using a global score aggregating 12 CBT techniques, only BR produced scores that did not differ from DO. This secondary analysis examined the concordance of each alternate method with DO for each discrete CBT technique and tested whether concordance differed for cognitive techniques (e.g., cognitive education) versus behavioral techniques (e.g., behavioral activation).
Results: Three-level mixed-effects regression models indicated that BR scores did not differ significantly from DO for any technique, and CSR scores did not differ from DO for nine techniques (all ps > .05). In contrast, self-report scores differed from DO for all but one technique, with greater concordance for behavioral than for cognitive techniques (z = -3.29, p < .001).
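Analytic note: as a minimal sketch of the kind of three-level model described above (scores nested within clinicians, who are nested within agencies), the Python/statsmodels code below shows one plausible specification. The data file, column names, method coding, and the choice to fit one model per technique are illustrative assumptions, not the authors' actual analysis code.

# A minimal sketch, not the authors' code: a three-level mixed-effects
# model comparing each measurement method's technique scores to direct
# observation (DO). File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per scored observation for a given technique,
# with DO as the reference level of `method`.
df = pd.read_csv("technique_scores_long.csv")  # hypothetical file
df["method"] = pd.Categorical(
    df["method"], categories=["DO", "self_report", "CSR", "BR"]
)

# groups= supplies the agency-level random intercept; vc_formula adds a
# clinician-within-agency variance component, giving three levels. One
# such model could be fit per discrete CBT technique.
model = smf.mixedlm(
    "score ~ method",
    data=df,
    groups="agency",
    re_formula="1",
    vc_formula={"clinician": "0 + C(clinician)"},
)
result = model.fit(reml=True)

# Each `method` coefficient estimates how that method's scores differ
# from DO; p-values above .05 would indicate no significant departure.
print(result.summary())

Under these assumptions, the same structure could be extended with a method-by-technique-type interaction to test a cognitive-versus-behavioral concordance contrast like the one reported above.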
Conclusions: In contrast to previous findings based on an aggregate score, we found that neither BR nor CSR differed significantly from DO for most of the techniques tested. These findings have implications for implementation research and usual care settings; they support multiple viable measurement methods that are less resource-intensive than DO.