{"title":"Presidential address: reflection on work from April 2019 to 2022 and appreciation to the staff and volunteers.","authors":"Yoon-Seong Lee","doi":"10.3352/jeehp.2022.19.39","DOIUrl":"https://doi.org/10.3352/jeehp.2022.19.39","url":null,"abstract":"","PeriodicalId":46098,"journal":{"name":"Journal of Educational Evaluation for Health Professions","volume":"19 ","pages":"39"},"PeriodicalIF":4.4,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9922552/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9301251","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Medical students’ self-assessed efficacy and satisfaction with training on endotracheal intubation and central venous catheterization with smart glasses in Taiwan: a non-equivalent control-group pre- and post-test study","authors":"Yu-Fan Lin, Chien-Ying Wang, Yen-Hsun Huang, Sheng-Min Lin, Ying-Ying Yang","doi":"10.3352/jeehp.2022.19.25","DOIUrl":"https://doi.org/10.3352/jeehp.2022.19.25","url":null,"abstract":"<p><strong>Purpose: </strong>Endotracheal intubation and central venous catheterization are essential procedures in clinical practice. Simulation-based technology such as smart glasses has been used to facilitate medical students’ training on these procedures. We investigated medical students’ self-assessed efficacy and satisfaction regarding the practice and training of these procedures with smart glasses in Taiwan.</p><p><strong>Methods: </strong>This observational study enrolled 145 medical students in the 5th and 6th years participating in clerkships at Taipei Veterans General Hospital between October 2020 and December 2021. Students were divided into the smart glasses or the control group and received training at a workshop. The primary outcomes included students’ pre- and post-intervention scores for self-assessed efficacy and satisfaction with the training tool, instructor’s teaching, and the workshop.</p><p><strong>Results: </strong>The pre-intervention scores for self-assessed efficacy of 5th- and 6th-year medical students in endotracheal intubation and central venous catheterization procedures showed no significant difference. The post-intervention score of self-assessed efficacy in the smart glasses group was better than that of the control group. 
Moreover, 6th-year medical students in the smart glasses group showed higher satisfaction with the training tool, instructor’s teaching, and workshop than those in the control group.</p><p><strong>Conclusion: </strong>Smart glasses served as a suitable simulation tool for endotracheal intubation and central venous catheterization procedures training in medical students. Medical students practicing with smart glasses showed improved self-assessed efficacy and higher satisfaction with training, especially for procedural steps in a space-limited field. Simulation training on procedural skills with smart glasses in 5th-year medical students may be adjusted to improve their satisfaction.</p>","PeriodicalId":46098,"journal":{"name":"Journal of Educational Evaluation for Health Professions","volume":"19 ","pages":"25"},"PeriodicalIF":4.4,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9681602/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10751040","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Medical students’ satisfaction level with e-learning during the COVID-19 pandemic and its related factors: a systematic review","authors":"Mahbubeh Tabatabaeichehr, Samane Babaei, Mahdieh Dartomi, Peiman Alesheikh, Amir Tabatabaee, Hamed Mortazavi, Zohreh Khoshgoftar","doi":"10.3352/jeehp.2022.19.37","DOIUrl":"https://doi.org/10.3352/jeehp.2022.19.37","url":null,"abstract":"<p><strong>Purpose: </strong>This review investigated medical students’ satisfaction level with e-learning during the coronavirus disease 2019 (COVID-19) pandemic and its related factors.</p><p><strong>Methods: </strong>A comprehensive systematic search was performed of international literature databases, including Scopus, PubMed, Web of Science, and Persian databases such as Iranmedex and Scientific Information Database using keywords extracted from Medical Subject Headings such as “Distance learning,” “Distance education,” “Online learning,” “Online education,” and “COVID-19” from the earliest date to July 10, 2022. The quality of the studies included in this review was evaluated using the appraisal tool for cross-sectional studies (AXIS tool).</p><p><strong>Results: </strong>A total of 15,473 medical science students were enrolled in 24 studies. The level of satisfaction with e-learning during the COVID-19 pandemic among medical science students was 51.8%. 
Factors such as age, gender, clinical year, experience with e-learning before COVID-19, level of study, adaptation of course material content, interactivity, understanding of the content, active participation of the instructor in the discussion, multimedia use in teaching sessions, adequate time dedicated to e-learning, stress perception, and convenience had significant relationships with the satisfaction of medical students with e-learning during the COVID-19 pandemic.</p><p><strong>Conclusion: </strong>Given the inevitability of online education and e-learning, it is suggested that educational managers and policymakers choose the best online education methods for medical students by examining the various studies in this field, in order to increase students’ satisfaction with e-learning.</p>","PeriodicalId":46098,"journal":{"name":"Journal of Educational Evaluation for Health Professions","volume":"19 ","pages":"37"},"PeriodicalIF":4.4,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9899548/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10713714","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Factors associated with medical students’ scores on the National Licensing Exam in Peru: a systematic review","authors":"Javier Alejandro Flores Cohaila","doi":"10.3352/jeehp.2022.19.38","DOIUrl":"https://doi.org/10.3352/jeehp.2022.19.38","url":null,"abstract":"<p><strong>Purpose: </strong>This study aimed to identify factors that have been studied for their associations with National Licensing Examination (ENAM) scores in Peru.</p><p><strong>Methods: </strong>A search was conducted of literature databases and registers, including EMBASE, SciELO, Web of Science, MEDLINE, Peru’s National Register of Research Work, and Google Scholar. The following key terms were used: “ENAM” and “associated factors.” Studies in English and Spanish were included. The quality of the included studies was evaluated using the Medical Education Research Study Quality Instrument (MERSQI).</p><p><strong>Results: </strong>In total, 38,500 participants were enrolled in 12 studies. Most (11/12) studies were cross-sectional, except for one case-control study. Three studies were published in peer-reviewed journals. The mean MERSQI was 10.33. Better performance on the ENAM was associated with a higher grade point average (GPA) (n=8), internship setting in EsSalud (n=4), and regular academic status (n=3). 
Other factors showed associations in various studies, such as medical school, internship setting, age, gender, socioeconomic status, simulation tests, study resources, preparation time, learning styles, study techniques, test anxiety, and self-regulated learning strategies.</p><p><strong>Conclusion: </strong>The ENAM is a multifactorial phenomenon; our model gives students a locus of control over what they can do to improve their score (i.e., implement self-regulated learning strategies) and gives faculty, health policymakers, and managers a framework to improve ENAM scores (i.e., design remediation programs to improve GPA and integrate anxiety-management courses into the curriculum).</p>","PeriodicalId":46098,"journal":{"name":"Journal of Educational Evaluation for Health Professions","volume":"19 ","pages":"38"},"PeriodicalIF":4.4,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9889888/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10719657","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Possibility of independent use of the yes/no Angoff and Hofstee methods for the standard setting of the Korean Medical Licensing Examination written test: a descriptive study.","authors":"Do-Hwan Kim, Ye Ji Kang, Hoon-Ki Park","doi":"10.3352/jeehp.2022.19.33","DOIUrl":"https://doi.org/10.3352/jeehp.2022.19.33","url":null,"abstract":"<p><strong>Purpose: </strong>This study aims to apply the yes/no Angoff and Hofstee methods to actual Korean Medical Licensing Examination (KMLE) 2022 written examination data to estimate cut scores for the written KMLE.</p><p><strong>Methods: </strong>Fourteen panelists gathered to derive the cut score of the 86th KMLE written examination data using the yes/no Angoff method. The panel reviewed the items individually before the meeting and shared their respective understandings of the minimally competent physician. The standard setting process was conducted in 5 rounds over a total of 800 minutes. In addition, 2 rounds of the Hofstee method were conducted, before starting the standard setting process and after the second round of yes/no Angoff.</p><p><strong>Results: </strong>For yes/no Angoff, as each round progressed, the panel’s opinion gradually converged to a cut score of 198 points, and the final passing rate was 95.1%. The Hofstee cut score was 208 points out of a maximum of 320, with a passing rate of 92.1%, in the first round, and 204 points, with a passing rate of 93.3%, in the second round.</p><p><strong>Conclusion: </strong>The difference between the cut scores obtained through the yes/no Angoff and Hofstee methods did not exceed 2 percentage points, and both were within the range of cut scores from previous studies. In both methods, the differences among the panelists decreased as rounds were repeated. 
Overall, our findings suggest the acceptability of cut scores and the possibility of independent use of both methods.</p>","PeriodicalId":46098,"journal":{"name":"Journal of Educational Evaluation for Health Professions","volume":"19 ","pages":"33"},"PeriodicalIF":4.4,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9845067/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9155882","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Common models and approaches for the clinical educator to plan effective feedback encounters.","authors":"Cesar Orsini, Veena Rodrigues, Jorge Tricio, Margarita Rosel","doi":"10.3352/jeehp.2022.19.35","DOIUrl":"https://doi.org/10.3352/jeehp.2022.19.35","url":null,"abstract":"<p><p>Giving constructive feedback is crucial for learners to bridge the gap between their current performance and the desired standards of competence. Giving effective feedback is a skill that can be learned, practiced, and improved. Therefore, our aim was to explore feedback models used in clinical settings and assess their transferability to different clinical feedback encounters. We identified the 6 most common and accepted feedback models: the Feedback Sandwich, the Pendleton Rules, the One-Minute Preceptor, the SET-GO model, the R2C2 (Rapport/Reaction/Content/Coach) model, and the ALOBA (Agenda Led Outcome-based Analysis) model. We present a handy resource describing, for each model, its structure, strengths and weaknesses, requirements for educators and learners, and the feedback encounters for which it is most suitable. These feedback models represent practical frameworks for educators to adopt but also to adapt to their preferred style, combining and modifying them if necessary to suit their needs and context.</p>","PeriodicalId":46098,"journal":{"name":"Journal of Educational Evaluation for Health Professions","volume":"19 ","pages":"35"},"PeriodicalIF":4.4,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9842479/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10755916","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Equal Z standard-setting method to estimate the minimum number of panelists for a medical school’s objective structured clinical examination in Taiwan: a simulation study","authors":"Ying-Ying Yang, Pin-Hsiang Huang, Ling-Yu Yang, Chia-Chang Huang, Chih-Wei Liu, Shiau-Shian Huang, Chen-Huan Chen, Fa-Yauh Lee, Shou-Yen Kao, Boaz Shulruf","doi":"10.3352/jeehp.2022.19.27","DOIUrl":"https://doi.org/10.3352/jeehp.2022.19.27","url":null,"abstract":"<p><strong>Purpose: </strong>Undertaking a standard-setting exercise is a common method for setting pass/fail cut scores for high-stakes examinations. The recently introduced equal Z standard-setting method (EZ method) has been found to be a valid and effective alternative to the commonly used Angoff and Hofstee methods and their variants. The current study aims to estimate the minimum number of panelists required for obtaining acceptable and reliable cut scores using the EZ method.</p><p><strong>Methods: </strong>The primary data were extracted from 31 panelists who used the EZ method to set cut scores for a 12-station medical school final objective structured clinical examination (OSCE) in Taiwan. For this study, a new data set composed of 1,000 random samples of different panel sizes, ranging from 5 to 25 panelists, was established and analyzed. Analysis of variance was performed to measure the differences in the cut scores set by the sampled groups across all sizes within each station.</p><p><strong>Results: </strong>On average, a panel of 10 or more experts yielded cut scores with confidence of at least 90%, and a panel of 15 experts yielded cut scores with confidence of at least 95%. No significant differences in cut scores associated with panel size were identified for panels of 5 or more experts.</p><p><strong>Conclusion: </strong>The EZ method was found to be valid and feasible. Less than an hour was required for 12 panelists to assess 12 OSCE stations. 
Calculating the cut scores required only basic statistical skills.</p>","PeriodicalId":46098,"journal":{"name":"Journal of Educational Evaluation for Health Professions","volume":"19 ","pages":"27"},"PeriodicalIF":4.4,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9764018/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10823639","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Suggestion of more suitable study designs and the corresponding reporting guidelines in articles published in the Journal of Educational Evaluation for Health Professions from 2021 to September 2022: a descriptive study.","authors":"Soo Young Kim","doi":"10.3352/jeehp.2022.19.36","DOIUrl":"https://doi.org/10.3352/jeehp.2022.19.36","url":null,"abstract":"<p><strong>Purpose: </strong>This study aimed to suggest more suitable study designs and the corresponding reporting guidelines for the papers published in the Journal of Educational Evaluation for Health Professions from January 2021 to September 2022.</p><p><strong>Methods: </strong>Among the 59 papers published in the Journal of Educational Evaluation for Health Professions from January 2021 to September 2022, research articles, review articles, and brief reports were selected. The following were analyzed: first, the percentage of articles describing the study design in the title, abstract, or methods; second, the portion of articles describing reporting guidelines; third, the types of study design and corresponding reporting guidelines; and fourth, the suggestion of more suitable study designs based on the study design algorithms for medical literature on interventions, systematic reviews and other review types, and epidemiological studies.</p><p><strong>Results: </strong>Out of 45 articles, 44 described study designs (97.8%). Of those 44, 19 articles were suggested to be described with more suitable study designs, which mainly occurred in before-and-after studies, diagnostic research, and non-randomized trials. Of the 18 reporting guidelines mentioned, 8 (44.4%) were considered perfect. 
STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) was used for descriptive studies, before-and-after studies, and randomized controlled trials; however, its use should be reconsidered.</p><p><strong>Conclusion: </strong>Some declarations of study design and reporting guidelines were suggested to be described with more suitable ones. Education and training on study design and reporting guidelines for researchers are needed, and reporting guideline policies for descriptive studies should also be implemented.</p>","PeriodicalId":46098,"journal":{"name":"Journal of Educational Evaluation for Health Professions","volume":"19 ","pages":"36"},"PeriodicalIF":4.4,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9889887/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10719208","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Is online objective structured clinical examination teaching an acceptable replacement in post-COVID-19 medical education in the United Kingdom?: a descriptive study","authors":"Vashist Motkur, Aniket Bharadwaj, Nimalesh Yogarajah","doi":"10.3352/jeehp.2022.19.30","DOIUrl":"https://doi.org/10.3352/jeehp.2022.19.30","url":null,"abstract":"<p><strong>Purpose: </strong>Coronavirus disease 2019 (COVID-19) restrictions resulted in an increased emphasis on virtual communication in medical education. This study assessed the acceptability of virtual teaching in an online objective structured clinical examination (OSCE) series and its role in future education.</p><p><strong>Methods: </strong>Six surgical OSCE stations were designed, covering common surgical topics, with specific tasks testing data interpretation, clinical knowledge, and communication skills. These were delivered via Zoom to students who participated in student/patient/examiner role-play. Feedback was collected by asking students to compare online teaching with previous experiences of in-person teaching. Descriptive statistics were used for Likert response data, and thematic analysis for free-text items.</p><p><strong>Results: </strong>Sixty-two students provided feedback, with 81% of respondents finding online instructions preferable to paper equivalents. Furthermore, 65% and 68% found online teaching more efficient and accessible, respectively, than in-person teaching. Only 34% found communication with each other easier online, and 40% preferred online OSCE teaching to in-person teaching. Students also expressed feedback in positive and negative free-text comments.</p><p><strong>Conclusion: </strong>The data suggested that students were generally unwilling for online teaching to completely replace in-person teaching. The success of online teaching depended on the clinical skill being addressed; some skills were less amenable to a virtual setting. 
However, online OSCE teaching could play a role alongside in-person teaching.</p>","PeriodicalId":46098,"journal":{"name":"Journal of Educational Evaluation for Health Professions","volume":"19 ","pages":"30"},"PeriodicalIF":4.4,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9807458/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10512389","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Presidential address: Quarantine guideline to protect examinees from COVID-19, clinical skill examination for dental licensing, and computer-based testing for medical, dental, and oriental medicine licensing","authors":"Yoon-Seong Lee","doi":"10.3352/jeehp.2020.18.1","DOIUrl":"https://doi.org/10.3352/jeehp.2020.18.1","url":null,"abstract":"","PeriodicalId":46098,"journal":{"name":"Journal of Educational Evaluation for Health Professions","volume":"1 1","pages":""},"PeriodicalIF":4.4,"publicationDate":"2021-01-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47002406","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}