Validity in Question: Is Pharmacy Educational Research Meeting the Standards?
Kamila A. Dell, Gwendolyn A. Wantuch, Neal Benedict, Michael J. Peeters
American Journal of Pharmaceutical Education, 89(2), Article 101350 (February 2025)
DOI: 10.1016/j.ajpe.2024.101350
URL: https://www.sciencedirect.com/science/article/pii/S0002945924110698
Citations: 0
Abstract
Objective
We aimed to quantify the rate of validity evidence reporting for educational testing described in the pharmacy education literature and to determine whether this has changed over a 10-year period.
Methods
Articles published between 2019 and 2021 in 5 pharmacy education journals were reviewed to identify studies that reported the use of educational testing for knowledge, skills, and/or abilities. Two investigators independently screened each included article and coded the sources of validity evidence reported, based on the Standards for Educational and Psychological Testing.
Results
Overall, 1467 articles were screened, of which 22% (326/1467) reported use of educational testing. Of the included articles, almost one-third (30%) reported 2 or more sources of validity evidence and another 39% reported only 1 source. Validity evidence for content was reported most frequently (54%), while validity evidence for internal structure, including reliability, was reported much less often (17%). Alarmingly, many articles (31%) did not report any source of validity evidence for their student learning assessments. Compared to a decade ago, fewer articles reported validity evidence for their student learning assessments.
Conclusion
Despite the critical role of validity evidence in ensuring accurate interpretation of educational testing, its reporting in the pharmacy education literature remains inconsistent and has declined over the past decade. Because accurate test score interpretation requires validity evidence, its absence undermines the credibility and impact of research findings. Scholars, journal reviewers, and journal editors should do more to ensure that research reports include validity evidence when inferences are made from learning assessment scores.
About the Journal
The Journal accepts unsolicited manuscripts that have not been published and are not under consideration for publication elsewhere. The Journal considers only material related to pharmaceutical education for publication. Authors must prepare manuscripts to conform to the Journal style (Author Instructions). All manuscripts are subject to peer review and approval by the editor prior to acceptance for publication. Reviewers are assigned by the editor with the advice of the editorial board as needed. Manuscripts are submitted and processed online (Submit a Manuscript) using Editorial Manager, an online manuscript tracking system that facilitates communication between the editorial office, editor, associate editors, reviewers, and authors.
After a manuscript is accepted, it is scheduled for publication in an upcoming issue of the Journal. All manuscripts are formatted, copyedited, and returned to the author for review and approval of the changes. Approximately 2 weeks prior to publication, the author receives an electronic proof of the article for final review and approval. Authors are not assessed page charges for publication.