Differences in Physician Performance and Self-rated Confidence on High- and Low-Stakes Knowledge Assessments in Board Certification
David W Price, Ting Wang, Thomas R O'Neill, Andrew Bazemore, Warren P Newton
Journal of Continuing Education in the Health Professions, pp. 2-10. DOI: 10.1097/CEH.0000000000000487
Abstract
Introduction: Evidence links assessment to optimal learning, affirming that physicians are more likely to study, learn, and practice skills when some form of consequence ("stakes") may result from an assessment. We lack evidence, however, on how physicians' confidence in their knowledge relates to performance on assessments, and whether this varies based on the stakes of the assessment.
Methods: Using a retrospective repeated-measures design, we compared patterns of answer accuracy and answer confidence among physicians participating in both a high-stakes and a low-stakes longitudinal assessment of the American Board of Family Medicine.
Results: After 1 and 2 years, participants were more often correct but less confident in their accuracy on a higher-stakes longitudinal knowledge assessment compared with a lower-stakes assessment. There were no differences in question difficulty between the two platforms. Variation existed between platforms in time spent answering questions, use of resources to answer questions, and perceived question relevance to practice.
Discussion: This novel study of physician certification suggests that physicians' answer accuracy increases with higher stakes, even as their self-reported confidence in their knowledge declines. It also suggests that physicians may be more engaged in higher-stakes than in lower-stakes assessments. With medical knowledge growing exponentially, these analyses illustrate the complementary roles of higher- and lower-stakes knowledge assessment in supporting physician learning during continuing specialty board certification.
Journal Introduction:
The Journal of Continuing Education in the Health Professions is a quarterly journal publishing articles relevant to theory, practice, and policy development for continuing education in the health sciences. The journal presents original research and essays on the lifelong learning of professionals, with a focus on continuous quality improvement, competency assessment, and knowledge translation. It provides thoughtful advice to those who develop, conduct, and evaluate continuing education programs.