Does Education Design Matter? Evaluating an Evidence-Based Continuing Education Intervention on Genomic Testing for Primary Care; a Pre-Test Post-Test Study.
Sharon Mitchell, Felix M Schmitz, Janusz Janczukowicz, Ann-Lea Buzzi, Noëlle Haas, Tanja Hitzblech, Julia Wagenfuehr, Idris Guessous, Sissel Guttormsen
{"title":"Does Education Design Matter? Evaluating an Evidence-Based Continuing Education Intervention on Genomic Testing for Primary Care; a Pre-Test Post-Test Study.","authors":"Sharon Mitchell, Felix M Schmitz, Janusz Janczukowicz, Ann-Lea Buzzi, Noëlle Haas, Tanja Hitzblech, Julia Wagenfuehr, Idris Guessous, Sissel Guttormsen","doi":"10.1080/28338073.2025.2526234","DOIUrl":null,"url":null,"abstract":"<p><strong>Background: </strong>: Quality continuing education (CE) interventions should be effective, fit-for-purpose, and flexible for healthcare professionals. However, variability in the quality of reported interventions limits their impact. Education providers must ensure well-designed learning experiences to maximise efficiency and relevance. This study details the systematic design of a genomic testing learning intervention, incorporating practical exercises and aligning with educational principles to evaluate its impact on knowledge acquisition, self-efficacy, and skills performance.</p><p><strong>Methods: </strong>: The intervention, conducted in a skills laboratory in Bern, Switzerland, included an interactive online learning module based on learning science principles. Participants engaged in simulated patient (SP) encounters to apply their skills, followed by an informal debriefing session with SPs and content experts. A pre-test post-test study design measured applied knowledge (patient scenario test), self-efficacy (confidence ratings), and skills performance (SP assessments). Wilcoxon tests assessed improvements, Mann-Whitney U tests identified group differences, and Pearson's r calculated effect sizes.</p><p><strong>Results: </strong>: Sixteen participants enrolled, including general practitioners (<i>n</i> = 8) and 4th year medical students (<i>n</i> = 8). In total, the balance of female/male participants was 9(=female)/7(=male), with an overall age of <i>M</i> = 35.9. After the intervention, participants had significantly higher applied knowledge scores (<i>W</i> = 98, |<i>z</i>| = 2.89, <i>p</i> = .004; <i>r</i> = .72), self-reported significantly higher confidence in genomic testing skills (<i>W</i> = 134, |<i>z</i>| = 3.41, <i>p</i> < .001; <i>r</i> = 0.85) and had significantly higher skills performance scores (<i>W</i> = 107, |<i>z</i>| = 2.02, <i>p</i> = .044; <i>r</i> = .50).</p><p><strong>Conclusion: </strong>: A well-designed learning intervention in genomic testing significantly improved applied knowledge, self-efficacy and skills performance in primary care. These findings underscore the importance of structured CE programmes, highlighting instructional design as a key factor in optimising learning outcomes.</p>","PeriodicalId":73675,"journal":{"name":"Journal of CME","volume":"14 1","pages":"2526234"},"PeriodicalIF":0.0000,"publicationDate":"2025-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12281646/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of CME","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/28338073.2025.2526234","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/1/1 0:00:00","PubModel":"eCollection","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0
Abstract
Background: Quality continuing education (CE) interventions should be effective, fit for purpose, and flexible for healthcare professionals. However, variability in the quality of reported interventions limits their impact. Education providers must ensure well-designed learning experiences to maximise efficiency and relevance. This study details the systematic design of a genomic testing learning intervention that incorporates practical exercises and aligns with educational principles, and evaluates its impact on knowledge acquisition, self-efficacy, and skills performance.
Methods: The intervention, conducted in a skills laboratory in Bern, Switzerland, included an interactive online learning module based on learning science principles. Participants applied their skills in simulated patient (SP) encounters, followed by an informal debriefing session with SPs and content experts. A pre-test post-test study design measured applied knowledge (patient scenario test), self-efficacy (confidence ratings), and skills performance (SP assessments). Wilcoxon tests assessed pre-post improvements, Mann-Whitney U tests identified group differences, and effect sizes were calculated as Pearson's r.
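The abstract does not include analysis code; the sketch below illustrates the kind of analysis described, assuming the Wilcoxon test is the paired signed-rank variant, that |z| is recovered from the two-sided p-value under the normal approximation, and that r = |z|/√N. The pre/post scores and group split are illustrative placeholders, not study data.

```python
# Minimal sketch of the described analysis (not the authors' code),
# using placeholder scores for N = 16 participants.
import numpy as np
from scipy import stats

# Hypothetical paired applied-knowledge scores (pre- and post-intervention).
pre = np.array([4, 5, 3, 6, 5, 4, 7, 5, 6, 4, 5, 3, 6, 5, 4, 6], dtype=float)
post = np.array([7, 8, 6, 8, 7, 6, 9, 8, 8, 7, 7, 6, 9, 8, 7, 8], dtype=float)

# Wilcoxon signed-rank test for pre- vs post-intervention improvement.
w_stat, p_value = stats.wilcoxon(pre, post)

# Recover |z| from the two-sided p-value and compute the effect size r = |z| / sqrt(N).
n = len(pre)
abs_z = stats.norm.isf(p_value / 2)
effect_r = abs_z / np.sqrt(n)
print(f"W = {w_stat:.0f}, |z| = {abs_z:.2f}, p = {p_value:.3f}, r = {effect_r:.2f}")

# Mann-Whitney U test for group differences (e.g. GPs vs medical students),
# here applied to the post-intervention scores of two illustrative subgroups.
group_a, group_b = post[:8], post[8:]
u_stat, p_group = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
print(f"U = {u_stat:.0f}, p = {p_group:.3f}")
```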
Results: Sixteen participants enrolled: general practitioners (n = 8) and fourth-year medical students (n = 8). Nine participants were female and seven were male, with a mean age of M = 35.9 years. After the intervention, participants had significantly higher applied knowledge scores (W = 98, |z| = 2.89, p = .004; r = .72), self-reported significantly higher confidence in genomic testing skills (W = 134, |z| = 3.41, p < .001; r = .85), and had significantly higher skills performance scores (W = 107, |z| = 2.02, p = .044; r = .50).
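The reported effect sizes are consistent with the common conversion of the z-approximation to Pearson's r, namely r = |z|/√N with N = 16; this formula is an assumption, as the abstract does not state it explicitly:

$$
r = \frac{|z|}{\sqrt{N}}, \qquad \text{e.g.}\quad r = \frac{2.89}{\sqrt{16}} = \frac{2.89}{4} \approx .72
$$

The other two values follow the same pattern (3.41/4 ≈ .85; 2.02/4 ≈ .50, presumably rounded from the unrounded z).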
Conclusion: A well-designed learning intervention in genomic testing significantly improved applied knowledge, self-efficacy, and skills performance in primary care. These findings underscore the importance of structured CE programmes and highlight instructional design as a key factor in optimising learning outcomes.