Stacey Bregman, Elana Thau, Martin Pusic, Manuela Perez, Kathy Boutis
{"title":"以绩效为基础的执业医师小儿胸片解读能力评估。","authors":"Stacey Bregman, Elana Thau, Martin Pusic, Manuela Perez, Kathy Boutis","doi":"10.1097/CEH.0000000000000481","DOIUrl":null,"url":null,"abstract":"<p><strong>Introduction: </strong>There is limited knowledge on pediatric chest radiograph (pCXR) interpretation skill among practicing physicians. We systematically determined baseline interpretation skill, the number of pCXR cases physicians required complete to achieve a performance benchmark, and which diagnoses posed the greatest diagnostic challenge.</p><p><strong>Methods: </strong>Physicians interpreted 434 pCXR cases via a web-based platform until they achieved a performance benchmark of 85% accuracy, sensitivity, and specificity. Interpretation difficulty scores for each case were derived by applying one-parameter item response theory to participant data. We compared interpretation difficulty scores across diagnostic categories and described the diagnoses of the 30% most difficult-to-interpret cases.</p><p><strong>Results: </strong>240 physicians who practice in one of three geographic areas interpreted cases, yielding 56,833 pCXR case interpretations. The initial diagnostic performance (first 50 cases) of our participants demonstrated an accuracy of 68.9%, sensitivity of 69.4%, and a specificity of 68.4%. The median number of cases completed to achieve the performance benchmark was 102 (interquartile range 69, 176; min, max, 54, 431). Among the 30% most difficult-to-interpret cases, 39.2% were normal pCXR and 32.3% were cases of lobar pneumonia. Cases with a single trauma-related imaging finding, cardiac, hilar, and diaphragmatic pathologies were also among the most challenging.</p><p><strong>Discussion: </strong>At baseline, practicing physicians misdiagnosed about one-third of pCXR and there was up to an eight-fold difference between participants in number of cases completed to achieve the standardized performance benchmark. We also identified the diagnoses with the greatest potential for educational intervention.</p>","PeriodicalId":50218,"journal":{"name":"Journal of Continuing Education in the Health Professions","volume":" ","pages":"28-34"},"PeriodicalIF":1.6000,"publicationDate":"2024-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Performance-Based Competency Assessment of Pediatric Chest Radiograph Interpretation Among Practicing Physicians.\",\"authors\":\"Stacey Bregman, Elana Thau, Martin Pusic, Manuela Perez, Kathy Boutis\",\"doi\":\"10.1097/CEH.0000000000000481\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Introduction: </strong>There is limited knowledge on pediatric chest radiograph (pCXR) interpretation skill among practicing physicians. We systematically determined baseline interpretation skill, the number of pCXR cases physicians required complete to achieve a performance benchmark, and which diagnoses posed the greatest diagnostic challenge.</p><p><strong>Methods: </strong>Physicians interpreted 434 pCXR cases via a web-based platform until they achieved a performance benchmark of 85% accuracy, sensitivity, and specificity. Interpretation difficulty scores for each case were derived by applying one-parameter item response theory to participant data. 
We compared interpretation difficulty scores across diagnostic categories and described the diagnoses of the 30% most difficult-to-interpret cases.</p><p><strong>Results: </strong>240 physicians who practice in one of three geographic areas interpreted cases, yielding 56,833 pCXR case interpretations. The initial diagnostic performance (first 50 cases) of our participants demonstrated an accuracy of 68.9%, sensitivity of 69.4%, and a specificity of 68.4%. The median number of cases completed to achieve the performance benchmark was 102 (interquartile range 69, 176; min, max, 54, 431). Among the 30% most difficult-to-interpret cases, 39.2% were normal pCXR and 32.3% were cases of lobar pneumonia. Cases with a single trauma-related imaging finding, cardiac, hilar, and diaphragmatic pathologies were also among the most challenging.</p><p><strong>Discussion: </strong>At baseline, practicing physicians misdiagnosed about one-third of pCXR and there was up to an eight-fold difference between participants in number of cases completed to achieve the standardized performance benchmark. We also identified the diagnoses with the greatest potential for educational intervention.</p>\",\"PeriodicalId\":50218,\"journal\":{\"name\":\"Journal of Continuing Education in the Health Professions\",\"volume\":\" \",\"pages\":\"28-34\"},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2024-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Continuing Education in the Health Professions\",\"FirstCategoryId\":\"95\",\"ListUrlMain\":\"https://doi.org/10.1097/CEH.0000000000000481\",\"RegionNum\":4,\"RegionCategory\":\"教育学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2022/12/21 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q2\",\"JCRName\":\"EDUCATION, SCIENTIFIC DISCIPLINES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Continuing Education in the Health Professions","FirstCategoryId":"95","ListUrlMain":"https://doi.org/10.1097/CEH.0000000000000481","RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2022/12/21 0:00:00","PubModel":"Epub","JCR":"Q2","JCRName":"EDUCATION, SCIENTIFIC DISCIPLINES","Score":null,"Total":0}
A Performance-Based Competency Assessment of Pediatric Chest Radiograph Interpretation Among Practicing Physicians.
Introduction: There is limited knowledge of pediatric chest radiograph (pCXR) interpretation skill among practicing physicians. We systematically determined baseline interpretation skill, the number of pCXR cases physicians were required to complete to achieve a performance benchmark, and which diagnoses posed the greatest diagnostic challenge.
Methods: Physicians interpreted 434 pCXR cases via a web-based platform until they achieved a performance benchmark of 85% accuracy, sensitivity, and specificity. Interpretation difficulty scores for each case were derived by applying one-parameter item response theory to participant data. We compared interpretation difficulty scores across diagnostic categories and described the diagnoses of the 30% most difficult-to-interpret cases.
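The one-parameter (Rasch) item response model used to derive per-case difficulty estimates the probability that a given physician interprets a given case correctly as a function of that physician's ability and the case's difficulty. The sketch below is a minimal Python illustration of this approach, not the authors' analysis code; the response matrix, learning rate, and iteration count are hypothetical, and it assumes a complete participant-by-case matrix, whereas in the study each participant interpreted only as many cases as needed to reach the benchmark.

import numpy as np

def fit_rasch(responses, n_iter=500, lr=0.02):
    """Joint maximum-likelihood fit of a one-parameter (Rasch) IRT model.
    responses: array of shape (participants, cases); 1 = correct, 0 = incorrect.
    Returns (abilities, difficulties); larger difficulty = harder case."""
    n_persons, n_items = responses.shape
    theta = np.zeros(n_persons)          # participant abilities
    b = np.zeros(n_items)                # case (item) difficulties
    for _ in range(n_iter):
        # Rasch model: P(correct) = sigmoid(theta_p - b_i)
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        resid = responses - p            # observed minus expected
        theta += lr * resid.sum(axis=1)  # gradient ascent on the log-likelihood
        b -= lr * resid.sum(axis=0)
        b -= b.mean()                    # anchor the scale (mean difficulty = 0)
    return theta, b

# Hypothetical usage: rank the 30% most difficult-to-interpret cases.
rng = np.random.default_rng(0)
responses = rng.integers(0, 2, size=(240, 434))  # placeholder data, not study data
_, difficulty = fit_rasch(responses.astype(float))
hardest = np.argsort(difficulty)[::-1][: int(0.30 * difficulty.size)]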
Results: 240 physicians practicing in one of three geographic areas interpreted cases, yielding 56,833 pCXR case interpretations. The initial diagnostic performance (first 50 cases) of our participants demonstrated an accuracy of 68.9%, a sensitivity of 69.4%, and a specificity of 68.4%. The median number of cases completed to achieve the performance benchmark was 102 (interquartile range 69-176; minimum 54, maximum 431). Among the 30% most difficult-to-interpret cases, 39.2% were normal pCXRs and 32.3% were cases of lobar pneumonia. Cases with a single trauma-related imaging finding and those with cardiac, hilar, and diaphragmatic pathologies were also among the most challenging.
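As a point of reference for the 85% benchmark and the initial performance figures above, the following is a minimal sketch (with hypothetical counts, not study data) of how accuracy, sensitivity, and specificity are computed from a participant's interpretations, treating abnormal pCXRs as the positive class.

def benchmark_metrics(tp, fp, tn, fn):
    """Return (accuracy, sensitivity, specificity) from interpretation counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # abnormal pCXRs correctly called abnormal
    specificity = tn / (tn + fp)   # normal pCXRs correctly called normal
    return accuracy, sensitivity, specificity

# Hypothetical counts: a participant keeps interpreting cases until all three reach 0.85.
print(benchmark_metrics(tp=43, fp=8, tn=42, fn=7))   # -> (0.85, 0.86, 0.84)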
Discussion: At baseline, practicing physicians misdiagnosed about one-third of pCXRs, and there was up to an eight-fold difference between participants in the number of cases completed to achieve the standardized performance benchmark. We also identified the diagnoses with the greatest potential for educational intervention.
About the journal:
The Journal of Continuing Education in the Health Professions is a quarterly journal publishing articles relevant to theory, practice, and policy development for continuing education in the health sciences. The journal presents original research and essays on subjects involving the lifelong learning of professionals, with a focus on continuous quality improvement, competency assessment, and knowledge translation. It provides thoughtful advice to those who develop, conduct, and evaluate continuing education programs.