Design and Implementation of the Health Professions Simulation Assessment, a Tool to Assess Students' Perceptions of Simulation Experiences
Kristin Curry Greenwood, Jennifer L. Kirwin, Zhiguang Huo
Journal of Acute Care Physical Therapy, 11(1): 70-78, April 2020. DOI: 10.1097/JAT.0000000000000123
Abstract
Background: Simulation is an important educational method in the health professions. While several academic programs have shared simulation quality assessment tools that are intended to be used in a particular discipline, a valid and reliable assessment that can be used by a variety of entry-level health professions education programs is lacking. In order to improve and refine interprofessional simulation programs, a tool that is acceptable to the multiple professions that participate in interprofessional simulation education is needed. The purpose of this study was to design and analyze an evidence-based quality assessment tool that could capture students' perceptions of simulation experiences and could be used by multiple health professions.

Subjects: The study included 329 students from different health professions majors who participated as part of their required coursework.

Methods: An evidence-based Health Professions Simulation Assessment (HPSA) was created in 2016, pilot tested in 2017, and then disseminated to a larger cohort in 2018. The results of the second dissemination were analyzed using R software to understand the validity and utility of the tool.

Results: The response rate for each question was more than 90% and the mean rate of agreement was 79.0% (±8.9%). We observed a high correlation among all pairs of questions (mean 0.51, SD 0.19). In addition, we performed hierarchical clustering and identified 4 clusters of questions that were highly correlated (preparation for experience, self-reflection/emotions, debriefing, and fidelity).

Conclusion: An evidence-based tool was created that could be used in a variety of health professions programs to evaluate students' perceptions of the quality of a simulation. This easily administered tool demonstrated satisfactory agreement; the data gathered through its use may be used to improve the quality of simulations in entry-level health professions education programs. This tool was found to be acceptable to multiple professions and could be used in interprofessional student groups to obtain a shared assessment of a simulation. Further research is warranted to determine validity among interprofessional groups of students.
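The authors' analysis code is not included with the abstract, but the Results describe a concrete workflow: pairwise correlations among the HPSA questions, summarized by their mean and SD, followed by hierarchical clustering cut into four clusters. The R sketch below illustrates that kind of analysis under stated assumptions: the data frame name (hpsa), the number of questions (20), and the simulated Likert responses are hypothetical stand-ins, not the study data.

    # Minimal sketch (not the authors' code): pairwise correlations among
    # simulation-assessment items followed by hierarchical clustering, as
    # described in the Methods/Results. The data frame 'hpsa' is hypothetical:
    # one row per student, one column per HPSA question (Likert responses).
    set.seed(1)
    hpsa <- as.data.frame(matrix(sample(1:5, 329 * 20, replace = TRUE),
                                 nrow = 329, ncol = 20))
    names(hpsa) <- paste0("Q", 1:20)

    # Pairwise correlations between questions, using complete pairs so that
    # items skipped by some respondents are still included.
    cor_mat <- cor(hpsa, use = "pairwise.complete.obs")

    # Summarize the off-diagonal correlations (reported in the paper as
    # mean 0.51, SD 0.19 for the real data).
    off_diag <- cor_mat[lower.tri(cor_mat)]
    mean(off_diag); sd(off_diag)

    # Hierarchical clustering of questions on correlation distance; cutting
    # the tree at k = 4 mirrors the four question clusters the authors report.
    dist_q   <- as.dist(1 - cor_mat)
    hc       <- hclust(dist_q, method = "complete")
    clusters <- cutree(hc, k = 4)
    plot(hc); rect.hclust(hc, k = 4)

Using 1 minus the correlation as the distance groups questions that respondents rate similarly into the same branch of the dendrogram, which is how item clusters such as "debriefing" or "fidelity" can emerge from the response data.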