Ranish K Patel, Phillip D Jenkins, Emily Lee, James Nitzkorwski, Ramanathan Seshadri, Rebecca Rhee, Mackenzie Cook, Julia Shelton, Julie Doberne, Jonathan Jesneck, Ruchi Thanawala

Journal of Surgical Education, 82(11):103737. Published 2025-10-08. DOI: 10.1016/j.jsurg.2025.103737
Profiling and Predicting Faculty Assessment Behavior in Surgical Education.
Objective: To profile the digital assessment behaviors of surgical faculty and residents, and to build a classifier that predicts assessment completion, in support of formative feedback initiatives.
Background: As competency-based paradigms are integrated into surgical training, developing digital education tools for measuring competency and providing rapid feedback is crucial. Simply making assessments available is inadequate and results in disappointingly low user participation. To optimize engagement and efficacy of these tools, user assessment behaviors need to be studied.
Methods: User data were aggregated from a HIPAA-compliant, electronic health record (EHR)-integrated medical education platform. Faculty and resident behaviors were analyzed with respect to factors such as time of day, day of week, device type, automated reminders, and EHR integration.
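The abstract does not specify the classifier's form, so the sketch below is purely illustrative and is not the authors' model. It trains a plain logistic regression (pure-Python gradient descent) on invented binary features named after the behavioral factors the study analyzed (weekday, device type, automated reminders, EHR integration); the synthetic labeling rule loosely echoes the reported direction of effects and carries no empirical weight.

```python
# Hypothetical sketch only: predicting assessment completion from the
# behavioral factors listed in the Methods. All feature names, data,
# and coefficients here are invented for illustration.
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(X, y, lr=0.1, epochs=300):
    """Logistic regression via per-example gradient descent (pure Python)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    return sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) >= 0.5

# Features: [is_weekday, is_mobile_device, reminder_sent, ehr_integrated]
random.seed(0)
X, y = [], []
for _ in range(400):
    xi = [random.randint(0, 1) for _ in range(4)]
    # Synthetic labels: weekday, reminders, and EHR integration raise the
    # odds of completion (directionally mirroring the Results, not the data).
    logit = -1.5 + 1.2 * xi[0] + 0.8 * xi[2] + 0.9 * xi[3]
    y.append(1 if random.random() < sigmoid(logit) else 0)
    X.append(xi)

w, b = train_logreg(X, y)
acc = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(X)
print(f"training accuracy: {acc:.2f}")
```

In practice, one would hold out a validation set and calibrate the decision threshold; this fragment only shows the shape of a completion classifier over factors like those the study tracked.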
Results: From 2022 to 2024, 254 attendings completed 10,729 assessments for 428 residents across 22 institutions. Faculty completed 86% of assessments on weekdays, completion was significantly influenced by automated platform triggers and EHR integration, and distinct faculty behavior profiles were established based on time to completion and comment length. Residents opened assessments at a median of 1.5 hours after faculty completion, and 96% of assessments were viewed within 24 hours.
Conclusions: Faculty assessment behaviors represent an actionable bottleneck that can be leveraged to optimize and tailor the design of digital education tools and enhance formative feedback.