Comparing Holistic and Mixed-Approach Rubrics for Academic Poster Quality
Michael J. Peeters, M. Ken Cor, Ashley N. Castleberry, Michael J. Gonyeau
American Journal of Pharmaceutical Education, Volume 89, Issue 4, Article 101379 (published 2025-02-28)
DOI: 10.1016/j.ajpe.2025.101379
Citations: 0
Abstract
Objective
Poster quality at academic conferences has varied. Furthermore, the few poster-quality rubrics in the literature have limited psychometric evidence. Thus, we compared holistic vs mixed-approach scoring using a recently created poster rubric, scored by multiple raters, to evaluate validation evidence and time-to-score utility.
Methods
Sixty research posters were randomly selected from an academic conference’s online poster repository. Using a previously created rubric (and without rubric training), 4 pharmacy education faculty members with varying levels of poster-related experience scored each poster. Initially, each rater holistically scored the posters, providing a single overall score for each. Approximately 1 month later, the raters scored the posters again using a mixed approach, assigning 4 sub-scores and a new overall score. We used Generalizability Theory to assess the effect of rater experience and the Rasch Measurement Model to examine rating-scale effectiveness and construct validation. Time-to-score for each poster was also compared.
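For readers unfamiliar with Generalizability Theory, the core quantity in a crossed posters-by-raters design is the G coefficient: the share of score variance attributable to the posters themselves rather than to rater disagreement. The sketch below is an illustrative, minimal implementation (not the authors' actual analysis, which would have used dedicated G-study software and a more complex design); the function name and the toy data are assumptions for demonstration only.

```python
import numpy as np

def g_coefficient(scores: np.ndarray) -> float:
    """Relative G coefficient for a crossed posters-x-raters (p x r) design.

    scores: 2-D array with rows = posters (objects of measurement)
            and columns = raters, one score per cell.
    """
    n_p, n_r = scores.shape
    grand = scores.mean()
    poster_means = scores.mean(axis=1)
    rater_means = scores.mean(axis=0)

    # Two-way ANOVA sums of squares (one observation per cell)
    ss_p = n_r * np.sum((poster_means - grand) ** 2)
    ss_r = n_p * np.sum((rater_means - grand) ** 2)
    ss_total = np.sum((scores - grand) ** 2)
    ss_res = ss_total - ss_p - ss_r  # poster-x-rater interaction + error

    ms_p = ss_p / (n_p - 1)
    ms_res = ss_res / ((n_p - 1) * (n_r - 1))

    # Estimated variance components (negative estimates truncated at 0)
    var_res = ms_res
    var_p = max((ms_p - ms_res) / n_r, 0.0)

    # Relative G: poster variance over poster variance plus mean rater error
    return var_p / (var_p + var_res / n_r)
```

With perfect rater agreement the residual variance is zero and the coefficient is 1.0; as raters diverge, the coefficient falls toward 0, which is the sense in which the study's more experienced raters "showed greater reliability."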
Results
Generalizability Theory showed greater reliability with more experienced raters or when using the mixed approach. Rasch analysis indicated that rating scales functioned better with the mixed approach, and Wright maps of the construct provided useful measurement validation evidence. Raters reported scoring more quickly (30–60 s per poster) with holistic scoring, though differences in rater experience affected reliability. Meanwhile, mixed-approach scoring was slightly slower (60–90 s per poster), but the impact of rater experience was reduced.
Conclusion
Scoring was slightly faster with the holistic approach than with the mixed-approach rubric; however, differences in rater experience were lessened using the mixed approach. The mixed approach was preferable because it allowed for quick scoring while reducing the need for prior training. This rubric could be used by students and new faculty when creating posters, or by poster-competition judges. Furthermore, mixed-approach rubrics may be applied beyond posters, including to oral presentations or objective structured clinical examination stations.
About the Journal
The Journal accepts unsolicited manuscripts that have not been published and are not under consideration for publication elsewhere. The Journal only considers material related to pharmaceutical education for publication. Authors must prepare manuscripts to conform to the Journal style (Author Instructions). All manuscripts are subject to peer review and approval by the editor prior to acceptance for publication. Reviewers are assigned by the editor with the advice of the editorial board as needed. Manuscripts are submitted and processed online (Submit a Manuscript) using Editorial Manager, an online manuscript tracking system that facilitates communication between the editorial office, editor, associate editors, reviewers, and authors.
After a manuscript is accepted, it is scheduled for publication in an upcoming issue of the Journal. All manuscripts are formatted, copyedited, and returned to the author for review and approval of the changes. Approximately 2 weeks prior to publication, the author receives an electronic proof of the article for final review and approval. Authors are not assessed page charges for publication.