Profiling and Predicting Faculty Assessment Behavior in Surgical Education.

Impact Factor: 2.1
Ranish K Patel, Phillip D Jenkins, Emily Lee, James Nitzkorwski, Ramanathan Seshadri, Rebecca Rhee, Mackenzie Cook, Julia Shelton, Julie Doberne, Jonathan Jesneck, Ruchi Thanawala
{"title":"Profiling and Predicting Faculty Assessment Behavior in Surgical Education.","authors":"Ranish K Patel, Phillip D Jenkins, Emily Lee, James Nitzkorwski, Ramanathan Seshadri, Rebecca Rhee, Mackenzie Cook, Julia Shelton, Julie Doberne, Jonathan Jesneck, Ruchi Thanawala","doi":"10.1016/j.jsurg.2025.103737","DOIUrl":null,"url":null,"abstract":"<p><strong>Objective: </strong>To study and profile the digital assessment behaviors of surgical faculty and residents, and to build a classifier to predict assessment completion, enhancing formative feedback initiatives.</p><p><strong>Background: </strong>As competency-based paradigms are integrated into surgical training, developing digital education tools for measuring competency and providing rapid feedback is crucial. Simply making assessments available is inadequate and results in disappointingly low user participation. To optimize engagement and efficacy of these tools, user assessment behaviors need to be studied.</p><p><strong>Methods: </strong>User data was aggregated from a HIPAA-compliant electronic health record (EHR)-integrated medical education platform. Faculty and resident behaviors were analyzed with respect to factors, such as time, day, device type, automated reminders, and EHR integration.</p><p><strong>Results: </strong>10,729 assessments were completed by 254 attendings for 428 residents across 22 institutions, from 2022 to 2024. 86% of assessments were completed by faculty on weekdays, were significantly influenced by automated platform triggers and EHR integration, and distinct faculty behavior profiles contingent upon time to completion and comment length were established. Residents opened assessments at a median of 1.5 hours of faculty assessment completion, with 96% of assessments viewed by 24 hours.</p><p><strong>Conclusions: </strong>Faculty assessment behaviors represent an actionable bottleneck which can be leveraged to optimize and tailor the design of digital education tools, to enhance formative feedback.</p>","PeriodicalId":94109,"journal":{"name":"Journal of surgical education","volume":"82 11","pages":"103737"},"PeriodicalIF":2.1000,"publicationDate":"2025-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of surgical education","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1016/j.jsurg.2025.103737","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Objective: To study and profile the digital assessment behaviors of surgical faculty and residents, and to build a classifier to predict assessment completion, enhancing formative feedback initiatives.

Background: As competency-based paradigms are integrated into surgical training, developing digital education tools for measuring competency and providing rapid feedback is crucial. Simply making assessments available is inadequate and results in disappointingly low user participation. To optimize engagement and efficacy of these tools, user assessment behaviors need to be studied.

Methods: User data were aggregated from a HIPAA-compliant, electronic health record (EHR)-integrated medical education platform. Faculty and resident behaviors were analyzed with respect to factors such as time of day, day of the week, device type, automated reminders, and EHR integration.
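To make the prediction task concrete, the following is a minimal, hypothetical sketch of a completion classifier built from the behavioral factors named in the Methods. The feature names, the synthetic event log, and the gradient-boosting model are illustrative assumptions and are not the authors' published pipeline.

```python
# Hypothetical sketch: predict whether a triggered assessment will be completed,
# using the factors named in the Methods (time, day, device type, automated
# reminders, EHR integration). All feature names and data are assumptions.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)
n = 2000  # synthetic stand-in for the platform's event log (one row per triggered assessment)
df = pd.DataFrame({
    "hour_of_day": rng.integers(0, 24, n),        # when the trigger fired
    "day_of_week": rng.integers(0, 7, n),         # 0 = Monday
    "device_type": rng.choice(["mobile", "desktop"], n),
    "reminder_sent": rng.integers(0, 2, n),       # automated platform trigger
    "ehr_integrated": rng.integers(0, 2, n),      # EHR-integrated workflow
})
# Toy label: weekdays, reminders, and EHR integration raise completion odds.
logit = -1.0 + 0.8 * (df["day_of_week"] < 5) + 0.9 * df["reminder_sent"] + 0.7 * df["ehr_integrated"]
df["completed"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

features = ["hour_of_day", "day_of_week", "device_type", "reminder_sent", "ehr_integrated"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["completed"], test_size=0.25, random_state=0, stratify=df["completed"]
)

model = Pipeline([
    # One-hot encode the categorical device column; pass numeric columns through.
    ("encode", ColumnTransformer(
        [("device", OneHotEncoder(handle_unknown="ignore"), ["device_type"])],
        remainder="passthrough",
    )),
    ("clf", GradientBoostingClassifier(random_state=0)),
])
model.fit(X_train, y_train)
print("AUROC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

In practice the same pattern applies to the real event log: one row per triggered assessment, behavioral covariates as features, and completion as the binary label.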

Results: From 2022 to 2024, 10,729 assessments were completed by 254 attendings for 428 residents across 22 institutions. Of these, 86% were completed by faculty on weekdays; completion was significantly influenced by automated platform triggers and EHR integration; and distinct faculty behavior profiles were established based on time to completion and comment length. Residents opened assessments a median of 1.5 hours after faculty completed them, and 96% of assessments were viewed within 24 hours.
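The following is a minimal sketch of how faculty behavior profiles could be derived from the two dimensions reported here (time to completion and comment length). The k-means method, the choice of three clusters, and the synthetic per-faculty summaries are assumptions for illustration; the abstract states only that distinct profiles were established.

```python
# Hypothetical sketch: group faculty into behavior profiles from per-faculty
# summaries of time to completion and comment length. Method and data are assumed.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_faculty = 254  # number of attendings reported in the Results

# Synthetic per-faculty summaries: median hours to complete an assessment and
# median comment length in characters.
hours_to_complete = rng.lognormal(mean=1.5, sigma=0.8, size=n_faculty)
comment_length = rng.lognormal(mean=4.0, sigma=0.7, size=n_faculty)
X = np.column_stack([hours_to_complete, comment_length])

# Standardize so both dimensions contribute comparably, then cluster.
X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)

for k in range(3):
    members = X[labels == k]
    print(
        f"profile {k}: n={len(members)}, "
        f"median hours={np.median(members[:, 0]):.1f}, "
        f"median comment chars={np.median(members[:, 1]):.0f}"
    )
```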

Conclusions: Faculty assessment behaviors represent an actionable bottleneck that can be leveraged to optimize and tailor the design of digital education tools and enhance formative feedback.
