{"title":"Commentary: Modernizing Educational Assessment Training for Changing Job Markets","authors":"André A. Rupp","doi":"10.1111/emip.12629","DOIUrl":"10.1111/emip.12629","url":null,"abstract":"","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":"43 3","pages":"33-38"},"PeriodicalIF":2.7,"publicationDate":"2024-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142211659","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Foundational Competencies in Educational Measurement: A Rejoinder
Andrew D. Ho, Terry A. Ackerman, Deborah L. Bandalos, Derek C. Briggs, Howard T. Everson, Susan M. Lottridge, Matthew J. Madison, Sandip Sinharay, Michael C. Rodriguez, Michael Russell, Alina A. von Davier, Stefanie A. Wind
Educational Measurement: Issues and Practice, 43(3), 56-63. Published 2024-08-12. DOI: 10.1111/emip.12623.
Abstract: What are foundational competencies in educational measurement? We published a framework for these foundational competencies in this journal (Ackerman et al., 2024) and were grateful to receive eight commentaries raising a number of important questions about the framework and its implications. We identified five cross-cutting recommendations among the eight commentaries relating to (1) our process and purpose, (2) Artificial Intelligence, (3) ethical competencies, (4) qualitative, critical, and culturally responsive commentaries, and (5) intersecting professions in, for example, classroom assessment and content development. In this rejoinder, we respond to these five recommendations and to each of the eight commentaries in turn. We hope that discussion and consensus about foundational competencies in educational measurement continue to advance in our journals and our field.

Commentary: Perspectives of Early Career Professionals on Enhancing Cultural Responsiveness in Educational Measurement
Dustin S. J. Van Orman, Janine A. Jackson, Thao T. Vo, Darius D. Taylor
Educational Measurement: Issues and Practice, 43(3), 27-32. Published 2024-08-12. DOI: 10.1111/emip.12628. Open access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1111/emip.12628
Abstract: The "Foundational Competencies in Educational Measurement" framework aims to shape the field's future. However, the absence of emerging scholars and graduate students from the task force highlights a gap in representing those most familiar with the current educational landscape. As early career scholars, we offer perspectives to enhance this framework by focusing on making educational measurement more inclusive, collaborative, and culturally responsive. Drawing on our diverse backgrounds and experiences, we propose expanding the framework to empower measurement professionals, diversify measurement practices, and integrate ethical considerations. We also advocate for a new visual representation of the framework as a budding plant, symbolizing the organic and evolving nature of foundational skills in educational measurement. This commentary aims to refine the foundational competencies to better prepare future professionals for meaningful, equitable educational contributions.

{"title":"Improving Instructional Decision‐Making Using Diagnostic Classification Models","authors":"W. Jake Thompson, Amy K. Clark","doi":"10.1111/emip.12619","DOIUrl":"https://doi.org/10.1111/emip.12619","url":null,"abstract":"In recent years, educators, administrators, policymakers, and measurement experts have called for assessments that support educators in making better instructional decisions. One promising approach to measurement to support instructional decision‐making is diagnostic classification models (DCMs). DCMs are flexible psychometric models that facilitate fine‐grained reporting on skills that students have mastered. In this article, we describe how DCMs can be leveraged to support better decision‐making. We first provide a high‐level overview of DCMs. We then describe different methods for reporting results from DCM‐based assessments that support decision‐making for different stakeholder groups. We close with a discussion of considerations for implementing DCMs in an operational setting, including how they can inform decision‐making at state and local levels, and share future directions for research.","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":"28 1","pages":""},"PeriodicalIF":2.0,"publicationDate":"2024-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141509562","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Blending Strategic Expertise and Technology: A Case Study for Practice Analysis
Bharati B. Belwalkar, Matthew Schultz, Christina Curnow, J. Carl Setzer
Educational Measurement: Issues and Practice, 43(3), 85-94. Published 2024-05-20. DOI: 10.1111/emip.12607.
Abstract: Technology is increasingly integrated into the workplace (World Economic Forum), and organizations are relying on advanced technological approaches to improve their human capital processes and stay relevant and competitive in complex environments. All professions must keep up with this transition and begin integrating technology into their tools and processes. This paper centers on how advanced technological approaches, such as natural language processing (NLP) and data mining, have complemented a traditional practice analysis of the accounting profession. We also discuss the strategic selection and use of subject-matter experts (SMEs) for more efficient practice analysis. We adopted a triangulation process: gathering information from a traditional practice analysis, using selected SMEs, and confirming findings with a novel NLP-based approach. These methods collectively contributed to revising the Uniform CPA Exam blueprint and to understanding accounting trends.

{"title":"2023 NCME Presidential Address: Some Musings on Comparable Scores","authors":"Deborah J. Harris","doi":"10.1111/emip.12609","DOIUrl":"10.1111/emip.12609","url":null,"abstract":"<p>This article is based on my 2023 NCME Presidential Address, where I talked a bit about my journey into the profession, and more substantively about comparable scores. Specifically, I discussed some of the different ways ‘comparable scores’ are defined, highlighted some areas I think we as a profession need to pay more attention to when considering score comparability, and emphasized that comparability in this context is a matter of degree which varies according to the decisions we plan to make on particular scores.</p>","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":"43 2","pages":"6-15"},"PeriodicalIF":2.0,"publicationDate":"2024-05-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/emip.12609","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140929271","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Examining Gender Differences in TIMSS 2019 Using a Multiple-Group Hierarchical Speed-Accuracy-Revisits Model","authors":"Dihao Leng, Ummugul Bezirhan, Lale Khorramdel, Bethany Fishbein, Matthias von Davier","doi":"10.1111/emip.12606","DOIUrl":"10.1111/emip.12606","url":null,"abstract":"<p>This study capitalizes on response and process data from the computer-based TIMSS 2019 Problem Solving and Inquiry tasks to investigate gender differences in test-taking behaviors and their association with mathematics achievement at the eighth grade. Specifically, a recently proposed hierarchical speed-accuracy-revisits (SAR) model was adapted to multiple country-by-gender groups to examine the extent to which mathematics ability, response speed, revisit propensity, and the relationship among them differ between boys and girls. Results across 10 countries showed that boys responded to items faster on average than girls, and there was greater variation in boys’ response speed across students. A mixture distribution of revisit propensity was found for all country-by-gender groups. Both genders had moderate to strong negative correlations between mathematics ability and response speed, supporting the speed-accuracy tradeoff pattern reported in the literature. Results are discussed in the context of low-stakes assessments and in relation to the utility of the multiple-group SAR model.</p>","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":"43 3","pages":"64-75"},"PeriodicalIF":2.7,"publicationDate":"2024-04-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/emip.12606","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140663098","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Guesses and Slips as Proficiency-Related Phenomena and Impacts on Parameter Invariance","authors":"Xiangyi Liao, Daniel M Bolt","doi":"10.1111/emip.12605","DOIUrl":"10.1111/emip.12605","url":null,"abstract":"<p>Traditional approaches to the modeling of multiple-choice item response data (e.g., 3PL, 4PL models) emphasize slips and guesses as random events. In this paper, an item response model is presented that characterizes both disjunctively interacting guessing and conjunctively interacting slipping processes as proficiency-related phenomena. We show how evidence for this perspective is seen in the systematic form of invariance violations for item slip and guess parameters under four-parameter IRT models when compared across populations of different mean proficiency levels. Specifically, higher proficiency populations tend to show higher guess and lower slip probabilities than lower proficiency populations. The results undermine the use of traditional models for IRT applications that require invariance and would suggest greater attention to alternatives.</p>","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":"43 3","pages":"76-84"},"PeriodicalIF":2.7,"publicationDate":"2024-04-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/emip.12605","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140589673","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}