Assessing Medication CDS Usability: Pilot Results from 10 Outpatient Clinics

Zoe Co, David W Bates, Jessica M Cole, Raj Ratwani, David C Classen

Applied Clinical Informatics, 16(4): 879-891. Published online 2025-08-20. DOI: 10.1055/a-2647-1069
Abstract
This study aimed to develop a human factors assessment for medication-related clinical decision support (CDS) based on a previously validated tool that assessed the integration of human factors principles in CDS, the Instrument for Evaluating Human Factors Principles in Medication-Related Decision Support Alerts (I-MeDeSA), and to pilot it in 10 outpatient clinics across the United States.

The human factors assessment was developed based on past validations of I-MeDeSA. Changes included rewording questions and reformatting answer choices as check-box options that allow multiple selections. We also added a section on how clinicians resolved alerts. Clinics received a percentage score based on how well their CDS adhered to human factors principles. To take the assessment, a testing team at each clinic triggered a high-severity drug-drug interaction (DDI) alert and then completed the human factors assessment. The assessment was piloted in 10 outpatient clinics, each using a different commercial electronic health record (EHR) system.

The final assessment included five sections and twelve questions covering the timing, visual presentation, severity, content, and actions of the DDI alert. The mean overall percentage score was 62%. Clinics' EHRs performed best on the sections covering the timing and visual presentation of the alert. However, in the "actions" section, 40% of the clinics could bypass high-severity alerts without any safeguards in place.

We found substantial variability in the integration of human factors principles in the design and delivery of DDI alerts among the outpatient clinics, and some lacked important medication safeguards. This assessment can be used by outpatient clinics for safety improvement initiatives.
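The abstract does not give the exact scoring formula. As a minimal sketch, assuming each check-box item corresponds to one satisfied human factors criterion and the overall score is simply the percentage of criteria satisfied across all sections, a clinic's score could be tallied as follows (the section names, item counts, and responses below are hypothetical, not taken from the study):

```python
from typing import Dict, List

# Hypothetical check-box responses for one clinic's DDI alert.
# True means the human factors criterion was judged satisfied.
clinic_responses: Dict[str, List[bool]] = {
    "timing": [True, True],
    "visual": [True, True, False],
    "severity": [True, False],
    "content": [False, True, True],
    "actions": [False, False],
}

def section_score(responses: List[bool]) -> float:
    """Fraction of human factors criteria satisfied within one section."""
    return sum(responses) / len(responses)

def overall_percentage(responses_by_section: Dict[str, List[bool]]) -> float:
    """Percentage of all criteria satisfied across every section."""
    satisfied = sum(sum(items) for items in responses_by_section.values())
    total = sum(len(items) for items in responses_by_section.values())
    return 100.0 * satisfied / total

if __name__ == "__main__":
    for name, items in clinic_responses.items():
        print(f"{name}: {section_score(items):.0%}")
    print(f"overall: {overall_percentage(clinic_responses):.0f}%")
```

A per-section breakdown like this would also make visible the pattern reported in the study, where clinics score well on timing and visual sections but poorly on the "actions" section.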
Journal description:
ACI is the third Schattauer journal dealing with biomedical and health informatics. It complements our other journals, Methods of Information in Medicine and the Yearbook of Medical Informatics. With the Yearbook of Medical Informatics being the "Milestone" or state-of-the-art journal and Methods of Information in Medicine being the "Science and Research" journal of IMIA, ACI intends to be the "Practical" journal of IMIA.