{"title":"Using Active Learning Methods to Strategically Select Essays for Automated Scoring","authors":"Tahereh Firoozi, Hamid Mohammadi, Mark J. Gierl","doi":"10.1111/emip.12537","DOIUrl":"https://doi.org/10.1111/emip.12537","url":null,"abstract":"<p>Research on Automated Essay Scoring has become increasing important because it serves as a method for evaluating students’ written responses at scale. Scalable methods for scoring written responses are needed as students migrate to online learning environments resulting in the need to evaluate large numbers of written-response assessments. The purpose of this study is to describe and evaluate three active learning methods that can be used to minimize the number of essays that must be scored by human raters while still providing the data needed to train a modern Automated Essay Scoring system. The three active learning methods are the uncertainty-based, the topological-based, and the hybrid method. These three methods were used to select essays included in the Automated Student Assessment Prize competition that were then classified using a scoring model that was trained with the bidirectional encoder representations from a transformer language model. All three active learning methods produced strong results, with the topological-based method producing the most efficient classification. Growth rate accuracy was also evaluated. The active learning methods produced different levels of efficiency under different sample size allocations but, overall, all three methods were highly efficient and produced classifications that were similar to one another.</p>","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":"42 1","pages":"34-43"},"PeriodicalIF":2.0,"publicationDate":"2022-12-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"50147865","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"ITEMS Corner Update: High Traffic to the ITEMS Portal on the NCME Website","authors":"Brian C. Leventhal","doi":"10.1111/emip.12532","DOIUrl":"10.1111/emip.12532","url":null,"abstract":"<p>As announced in the previous issue of <i>Educational Measurement: Issues and Practice</i>, the ITEMS portal is now hosted on the NCME website. This shift has many benefits. The modules are now easier to access for the NCME membership. Members can navigate to the portal via the link under the resources tab found on the ribbon at the top of each page on the website. Rather than having to go to an external site with a unique log in, all ITEMS modules are now available under the NCME brand directly on the primary site. The modules can be found:</p><p>https://www.ncme.org/itemsportal</p><p>Being hosted on the NCME website also allows more editorial control of the modules. New modules have an updated form with interactive features built into the browsing experience on the NCME website. Each module begins with a video abstract introducing the objectives learners can expect to achieve by completing the module, as well as an introduction of the authors. The content of the module is broken down into sections, each built around two to four section-specific learning objectives. For each section, authors develop a video of content and interactive learning checks, which are multiple choice items designed to check for understanding. There is an interactive activity for the learner to apply what they have learned in the module. Finally, the slides, sample data sets, example syntax, and other useful resources are available for download.</p><p>Since its launch in September 2022, the ITEMS portal has experienced considerable traffic. In the 30 days between September 12 and October 11, the ITEMS portal amassed just under 1,000 unique page views, with Figure 1 showcasing the daily traffic. At the same time, the original ITEMS portal has continued to remain active, amassing many more views. We are planning on shutting down the original ITEMS portal in the near future. It is important that links to ITEMS modules on the original portal be updated to the URL for the NCME website. Linking to new modules is simple. All modules have the same domain name, top-level domain, and path. All digital modules may be linked using the following URL template, replacing ## with the two-digit digital ITEMS module number: https://www.ncme.org/itemsportal/digital-modules/dm##.</p><p>I am thrilled to announce the second module of the new format on the NCME website. Jennifer Lewis and Steve Sireci author <i>Digital Module #30 Validity and Educational Testing: Purposes and Uses of Educational Tests</i>. In this five-part module, Lewis and Sireci discuss the purposes and uses of educational tests, the basic concepts of validity theory, the five sources of validity evidence, and how to document a “validity argument.” The module contains content that outlines definitions conceptually and provides concrete examples in K–12 testing but will be of use to anyone involved in testing or measurement.</p><p>We have several exciting ITEMS modules in development. 
There are still opportunities to autho","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":"41 4","pages":"79-80"},"PeriodicalIF":2.0,"publicationDate":"2022-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/emip.12532","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48219216","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
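The ITEMS Corner update above gives a fixed URL template for digital modules, with ## standing for the two-digit module number. As a small illustration only (the helper function below is an assumption of mine, not part of the column), the template can be filled programmatically when updating old links:

```python
# Illustrative helper (not from the column): build the NCME ITEMS module URL
# from a module number, zero-padding to the two-digit form the template requires.
def items_module_url(module_number: int) -> str:
    if not 1 <= module_number <= 99:
        raise ValueError("Digital ITEMS module numbers are two digits (01-99).")
    return f"https://www.ncme.org/itemsportal/digital-modules/dm{module_number:02d}"

print(items_module_url(30))  # -> https://www.ncme.org/itemsportal/digital-modules/dm30
```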
{"title":"Digital Module 30: Validity and Educational Testing: Purposes and Uses of Educational Tests","authors":"Jennifer Lewis, Stephen G. Sireci","doi":"10.1111/emip.12533","DOIUrl":"10.1111/emip.12533","url":null,"abstract":"<p>This module is designed for educators, educational researchers, and psychometricians who would like to develop an understanding of the basic concepts of validity theory, test validation, and documenting a “validity argument.” It also describes how an in-depth understanding of the purposes and uses of educational tests sets the foundation for validation. We describe the benefits and limitations of educational tests, the concept of validity and why it is important, and the types of validity evidence that should be used to support the use of a test for a particular purpose. We also discuss the need for assessment programs to provide such evidence and how it should be interpreted and documented to use educational assessments to best serve education.</p>","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":"41 4","pages":"81-82"},"PeriodicalIF":2.0,"publicationDate":"2022-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"47764078","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Ronald K. Hambleton (1943–2022): Setting the Standard for Measurement Excellence","authors":"Stephen G. Sireci","doi":"10.1111/emip.12530","DOIUrl":"10.1111/emip.12530","url":null,"abstract":"","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":"41 4","pages":"6-9"},"PeriodicalIF":2.0,"publicationDate":"2022-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42743650","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Evaluation of Automatic Item Generation: A Case Study of Weak Theory Approach","authors":"Yanyan Fu, Edison M. Choe, Hwanggyu Lim, Jaehwa Choi","doi":"10.1111/emip.12529","DOIUrl":"10.1111/emip.12529","url":null,"abstract":"<p>This case study applied the <i>weak theory</i> of Automatic Item Generation (AIG) to generate isomorphic item instances (i.e., unique but psychometrically equivalent items) for a large-scale assessment. Three representative instances were selected from each item template (i.e., model) and pilot-tested. In addition, a new analytical framework, differential child item functioning (DCIF) analysis, based on the existing differential item functioning statistics, was applied to evaluate the psychometric equivalency of item instances within each template. The results showed that, out of 23 templates, nine successfully generated isomorphic instances, five required minor revisions to make them isomorphic, and the remaining templates required major modifications. The results and insights obtained from the AIG template development procedure may help item writers and psychometricians effectively develop and manage the templates that generate isomorphic instances.</p>","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":"41 4","pages":"10-22"},"PeriodicalIF":2.0,"publicationDate":"2022-10-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46565878","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"ITEMS Corner Update: Announcing Two Significant Changes to ITEMS","authors":"Brian C. Leventhal","doi":"10.1111/emip.12524","DOIUrl":"10.1111/emip.12524","url":null,"abstract":"<p>In addition to an exciting new module on Multidimensional Item Response Theory (MIRT) equating, there are two important announcements regarding the Instructional Topics in Educational Measurement Series (ITEMS). After much discussion with authors, learners, the educational measurement community, and other stakeholders of ITEMS, I am pleased to announce (1) the transfer of the ITEMS portal to the National Council on Measurement in Education (NCME) website and (2) a new digital module format.</p><p><i>Transfer of the ITEMS portal to the NCME website</i>: In 2018, I, along with Matthew Gaertner, led efforts to launch the new NCME website on the Higher Logic platform. Besides bringing a modern look and feel to the organization's web presence, the platform was selected due to its flexibility and customizability. In the years since, traffic to the NCME website has continued to increase, and there has been a significant increase in site content (e.g., software database, special interest group community pages). In April of this year, just prior to the NCME Annual Conference, the website committee, led by Erin Banjanovic, released a much-needed re-organization of the content on the site. This wonderful overhaul has made navigating the NCME website easier with content now in more logical locations. However, noticeably absent from the NCME website has been the ITEMS portal. As a reminder, ITEMS is a publication from NCME that has a brief summary published in the <i>Educational Measurement: Issues and Practice</i> journal with the primary digital content on the ITEMS portal, freely available after registration. The ITEMS portal has been a Learning Management System based website, with many features and ripe for extension. Though from the user's perspective, it can be complex to navigate and necessitates additional navigation from the primary NCME website, also requiring unique log-in criteria.</p><p>It is at this time that I am pleased to announce that the ITEMS portal is now available on the NCME website at the following link: https://www.ncme.org/itemsportal</p><p>Transferring the ITEMS portal to the NCME website has several immediate benefits. First, all modules will remain free of charge, but no longer require additional registration. Second, they will have a different organization structure, improving navigating across modules and enabling more efficient access to key information. Finally, they will fall under the NCME brand, having the same look and feel with all the content on the NCME website.</p><p>Although this issue marks the launch of the ITEMS portal on the NCME website, the transfer of content remains a work in progress. For now, both the old and new ITEMS portal will be available and all links to the old ITEMS portal will remain functional. However, I would strongly advise all who embed or link to content to begin updating to the portal on the NCME website. 
Nearly all of the content has been shifted, but if you notice anything missing or ha","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":"41 3","pages":"83-84"},"PeriodicalIF":2.0,"publicationDate":"2022-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1111/emip.12524","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42982757","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}