Laurie M Aluce, Julie J Cooper, Lillian L Emlet, Elaine R Cohen, Gordon J Wood, Julia H Vermylen
{"title":"评估重大疾病沟通能力:评估员培训计划的发展。","authors":"Laurie M Aluce, Julie J Cooper, Lillian L Emlet, Elaine R Cohen, Gordon J Wood, Julia H Vermylen","doi":"10.1097/CEH.0000000000000613","DOIUrl":null,"url":null,"abstract":"<p><strong>Introduction: </strong>Rigorous rater training is necessary to ensure consistent feedback. Yet, there is a lack of published recommendations for how to train raters to provide reliable, consistent assessments for communication skills, thus making competency-based training in this area challenging. We describe a method for conducting rater training for serious illness communication skills and assess interrater reliability.</p><p><strong>Methods: </strong>We selected a previously published and validated tool for assessing serious illness communication skills. We created a rater training program adapted from a previously described program focused on team performance, making notable adjustments to tailor the program to the assessment of serious illness communication with patients given the unique challenges these conversations pose. We assessed interrater reliability at the end of the program using kappa coefficients for dichotomous checklist items and intraclass correlation coefficients for scaled items.</p><p><strong>Results: </strong>Five raters who are physicians with expertise in communication skills training completed the program. After training, raters assessed eight test videos. All raters achieved substantial agreement when compared to the gold standard rater for both the checklist (average overall kappa = 0.83) and scaled items (average overall intraclass correlation coefficient = 0.83).</p><p><strong>Discussion: </strong>We demonstrate an effective method for conducting rater training to assess serious illness communication skills that builds off a previously published program for team performance. Key adjustments included conducting facilitated discussions of videos and iteratively updating the rater training guide. This approach ensures reliable assessment within communication skills training.</p>","PeriodicalId":50218,"journal":{"name":"Journal of Continuing Education in the Health Professions","volume":" ","pages":""},"PeriodicalIF":1.7000,"publicationDate":"2025-08-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Assessing Competency in Serious Illness Communication: Development of a Rater Training Program.\",\"authors\":\"Laurie M Aluce, Julie J Cooper, Lillian L Emlet, Elaine R Cohen, Gordon J Wood, Julia H Vermylen\",\"doi\":\"10.1097/CEH.0000000000000613\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Introduction: </strong>Rigorous rater training is necessary to ensure consistent feedback. Yet, there is a lack of published recommendations for how to train raters to provide reliable, consistent assessments for communication skills, thus making competency-based training in this area challenging. We describe a method for conducting rater training for serious illness communication skills and assess interrater reliability.</p><p><strong>Methods: </strong>We selected a previously published and validated tool for assessing serious illness communication skills. We created a rater training program adapted from a previously described program focused on team performance, making notable adjustments to tailor the program to the assessment of serious illness communication with patients given the unique challenges these conversations pose. 
We assessed interrater reliability at the end of the program using kappa coefficients for dichotomous checklist items and intraclass correlation coefficients for scaled items.</p><p><strong>Results: </strong>Five raters who are physicians with expertise in communication skills training completed the program. After training, raters assessed eight test videos. All raters achieved substantial agreement when compared to the gold standard rater for both the checklist (average overall kappa = 0.83) and scaled items (average overall intraclass correlation coefficient = 0.83).</p><p><strong>Discussion: </strong>We demonstrate an effective method for conducting rater training to assess serious illness communication skills that builds off a previously published program for team performance. Key adjustments included conducting facilitated discussions of videos and iteratively updating the rater training guide. This approach ensures reliable assessment within communication skills training.</p>\",\"PeriodicalId\":50218,\"journal\":{\"name\":\"Journal of Continuing Education in the Health Professions\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":1.7000,\"publicationDate\":\"2025-08-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Continuing Education in the Health Professions\",\"FirstCategoryId\":\"95\",\"ListUrlMain\":\"https://doi.org/10.1097/CEH.0000000000000613\",\"RegionNum\":4,\"RegionCategory\":\"教育学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"EDUCATION, SCIENTIFIC DISCIPLINES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Continuing Education in the Health Professions","FirstCategoryId":"95","ListUrlMain":"https://doi.org/10.1097/CEH.0000000000000613","RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"EDUCATION, SCIENTIFIC DISCIPLINES","Score":null,"Total":0}
Assessing Competency in Serious Illness Communication: Development of a Rater Training Program.
Introduction: Rigorous rater training is necessary to ensure consistent feedback. Yet there is a lack of published recommendations for how to train raters to provide reliable, consistent assessments of communication skills, making competency-based training in this area challenging. We describe a method for conducting rater training for serious illness communication skills and assess interrater reliability.
Methods: We selected a previously published and validated tool for assessing serious illness communication skills. We created a rater training program adapted from a previously described program focused on team performance, making notable adjustments to tailor it to the assessment of serious illness communication with patients, given the unique challenges these conversations pose. We assessed interrater reliability at the end of the program using kappa coefficients for dichotomous checklist items and intraclass correlation coefficients for scaled items.
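The article does not specify how these reliability statistics were computed, so the sketch below is an illustration only: it computes Cohen's kappa for dichotomous checklist items and a two-way random-effects, single-rater ICC (ICC(2,1)) for scaled items. The data, the choice of ICC form, and the function names are hypothetical assumptions, not details taken from the study.

```python
import numpy as np

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring dichotomous (0/1) checklist items."""
    a = np.asarray(rater_a)
    b = np.asarray(rater_b)
    p_obs = np.mean(a == b)                        # observed agreement
    p_chance = (np.mean(a == 1) * np.mean(b == 1)  # chance agreement on "done"
                + np.mean(a == 0) * np.mean(b == 0))  # chance agreement on "not done"
    return (p_obs - p_chance) / (1 - p_chance)     # undefined if p_chance == 1

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `ratings` is an (n_subjects x k_raters) array of scaled scores."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)   # between-subject
    ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)   # between-rater
    ss_total = np.sum((x - grand) ** 2)
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical data: one trained rater vs. a gold-standard rater on 20
# dichotomous checklist items from a single test video.
gold    = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 0]
trainee = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 1]
print(f"kappa = {cohens_kappa(gold, trainee):.2f}")

# Hypothetical scaled (1-5) scores from two raters across eight test videos.
scaled = [[4, 4], [3, 3], [5, 4], [2, 2], [4, 5], [3, 3], [5, 5], [1, 2]]
print(f"ICC(2,1) = {icc_2_1(scaled):.2f}")
```

In a workflow like the one described, each trained rater would be compared against the gold-standard rater across the test videos and the per-rater statistics averaged to yield overall agreement figures.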
Results: Five raters, all physicians with expertise in communication skills training, completed the program. After training, the raters assessed eight test videos. All raters achieved substantial agreement with the gold-standard rater on both the checklist (average overall kappa = 0.83) and the scaled items (average overall intraclass correlation coefficient = 0.83).
Discussion: We demonstrate an effective method for conducting rater training to assess serious illness communication skills that builds on a previously published program for team performance. Key adjustments included conducting facilitated discussions of videos and iteratively updating the rater training guide. This approach ensures reliable assessment within communication skills training.
About the Journal:
The Journal of Continuing Education in the Health Professions is a quarterly journal publishing articles relevant to theory, practice, and policy development for continuing education in the health sciences. The journal presents original research and essays on subjects involving the lifelong learning of professionals, with a focus on continuous quality improvement, competency assessment, and knowledge translation. It provides thoughtful advice to those who develop, conduct, and evaluate continuing education programs.