{"title":"Patterns of Solution Behavior across Items in Low-Stakes Assessments","authors":"D. Pastor, Thai Q. Ong, S. Strickman","doi":"10.1080/10627197.2019.1615373","DOIUrl":null,"url":null,"abstract":"ABSTRACT The trustworthiness of low-stakes assessment results largely depends on examinee effort, which can be measured by the amount of time examinees devote to items using solution behavior (SB) indices. Because SB indices are calculated for each item, they can be used to understand how examinee motivation changes across items within a test. Latent class analysis (LCA) was used with the SB indices from three low-stakes assessments to explore patterns of solution behavior across items. Across tests, the favored models consisted of two classes, with Class 1 characterized by high and consistent solution behavior (>90% of examinees) and Class 2 by lower and less consistent solution behavior (<10% of examinees). Additional analyses provided supportive validity evidence for the two-class solution with notable differences between classes in self-reported effort, test scores, gender composition, and testing context. Although results were generally similar across the three assessments, striking differences were found in the nature of the solution behavior pattern for Class 2 and the ability of item characteristics to explain the pattern. The variability in the results suggests motivational changes across items may be unique to aspects of the testing situation (e.g., content of the assessment) for less motivated examinees.","PeriodicalId":46209,"journal":{"name":"Educational Assessment","volume":null,"pages":null},"PeriodicalIF":2.1000,"publicationDate":"2019-05-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1080/10627197.2019.1615373","citationCount":"26","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Educational Assessment","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/10627197.2019.1615373","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 26
Abstract
The trustworthiness of low-stakes assessment results largely depends on examinee effort, which can be measured by the amount of time examinees devote to items using solution behavior (SB) indices. Because SB indices are calculated for each item, they can be used to understand how examinee motivation changes across items within a test. Latent class analysis (LCA) was used with the SB indices from three low-stakes assessments to explore patterns of solution behavior across items. Across tests, the favored models consisted of two classes, with Class 1 characterized by high and consistent solution behavior (>90% of examinees) and Class 2 by lower and less consistent solution behavior (<10% of examinees). Additional analyses provided supportive validity evidence for the two-class solution, with notable differences between classes in self-reported effort, test scores, gender composition, and testing context. Although results were generally similar across the three assessments, striking differences were found in the nature of the solution behavior pattern for Class 2 and in the ability of item characteristics to explain that pattern. The variability in the results suggests that, for less motivated examinees, motivational changes across items may be unique to aspects of the testing situation (e.g., content of the assessment).
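To make the two-step approach described in the abstract concrete, the sketch below (not the authors' code) shows how per-item solution behavior indices might be dichotomized from response times using a time threshold, and how a simple two-class latent class model could be fit to those binary indices with an EM algorithm. The 10-second thresholds, the data dimensions, and the use of plain NumPy are illustrative assumptions, not the study's actual settings.

```python
# Minimal sketch: SB indices from response times, then a binary latent class analysis.
# Assumptions (not from the paper): 200 examinees, 25 items, 10-second thresholds.
import numpy as np

def solution_behavior_indices(response_times, thresholds):
    """SB index = 1 if the examinee spent at least the item's time threshold, else 0."""
    return (response_times >= thresholds).astype(int)  # shape: (n_examinees, n_items)

def fit_binary_lca(X, n_classes=2, n_iter=500, tol=1e-6, seed=0):
    """EM for a latent class model on binary indicators X (n_examinees x n_items)."""
    rng = np.random.default_rng(seed)
    n, j = X.shape
    class_probs = np.full(n_classes, 1.0 / n_classes)         # mixing proportions pi_k
    item_probs = rng.uniform(0.3, 0.7, size=(n_classes, j))   # P(SB = 1 | class k, item)
    prev_ll = -np.inf
    for _ in range(n_iter):
        # E-step: log-likelihood of each response pattern under each class,
        # assuming local independence of items within a class
        log_lik = (X @ np.log(item_probs).T
                   + (1 - X) @ np.log(1 - item_probs).T
                   + np.log(class_probs))
        log_norm = np.logaddexp.reduce(log_lik, axis=1, keepdims=True)
        resp = np.exp(log_lik - log_norm)                      # posterior class memberships
        # M-step: update class proportions and item-conditional probabilities
        class_probs = resp.mean(axis=0)
        item_probs = np.clip((resp.T @ X) / resp.sum(axis=0)[:, None], 1e-6, 1 - 1e-6)
        ll = log_norm.sum()
        if ll - prev_ll < tol:
            break
        prev_ll = ll
    return class_probs, item_probs, resp

# Illustrative usage with simulated response times (seconds) and hypothetical cutoffs.
rng = np.random.default_rng(1)
times = rng.gamma(shape=4.0, scale=10.0, size=(200, 25))
thresholds = np.full(25, 10.0)
sb = solution_behavior_indices(times, thresholds)
pi, p, posteriors = fit_binary_lca(sb, n_classes=2)
print("estimated class proportions:", pi.round(3))
```

In this kind of analysis, the estimated item-conditional probabilities would show, for each class, how consistently examinees exhibit solution behavior across items, which is what the abstract's contrast between a high-and-consistent class and a lower-and-less-consistent class refers to.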
Journal description:
Educational Assessment publishes original research and scholarship on the assessment of individuals, groups, and programs in educational settings. It includes theory, methodological approaches and empirical research in the appraisal of the learning and achievement of students and teachers, young children and adults, and novices and experts. The journal reports on current large-scale testing practices, discusses alternative approaches, presents scholarship on classroom assessment practices and includes assessment topics debated at the national level. It welcomes both conceptual and empirical pieces and encourages articles that provide a strong bridge between theory and/or empirical research and the implications for educational policy and/or practice.