Educational Assessment: Latest Articles

Student Engagement on the National Assessment of Educational Progress (NAEP): A Systematic Review and Meta-Analysis of Extant Research
IF 1.5
Educational Assessment Pub Date : 2022-03-06 DOI: 10.1080/10627197.2022.2043151
Allison J. LaFave, Josephine Taylor, Amelia M. Barter, Arielle Jacobs
{"title":"Student Engagement on the National Assessment of Educational Progress (NAEP): A Systematic Review and Meta-Analysis of Extant Research","authors":"Allison J. LaFave, Josephine Taylor, Amelia M. Barter, Arielle Jacobs","doi":"10.1080/10627197.2022.2043151","DOIUrl":"https://doi.org/10.1080/10627197.2022.2043151","url":null,"abstract":"ABSTRACT This systematic review examines empirical research about students’ motivation for NAEP in grades 4, 8, and 12 using multiple motivation constructs, including effort, value, and expectancy. Analyses yielded several findings. First, there are stark differences in the perceived importance of doing well on NAEP among students in grades 4 (86%), 8 (59%), and 12 (35%). Second, meta-analyses of descriptive data on the percentage of students who agreed with various expectancy statements (e.g., “I am good at mathematics”) revealed minimal variations across grade level. However, similar meta-analyses of data on the percentage of students who agreed with various value statements (e.g., “I like mathematics”) exposed notable variation across grade levels. Third, domain-specific motivation has a positive, statistically significant relationship with NAEP achievement. Finally, some interventions – particularly financial incentives – may have a modest, positive effect on NAEP achievement.","PeriodicalId":46209,"journal":{"name":"Educational Assessment","volume":"27 1","pages":"205 - 228"},"PeriodicalIF":1.5,"publicationDate":"2022-03-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45663142","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 1
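For readers unfamiliar with pooling descriptive agreement rates of the kind this review meta-analyzes, the sketch below shows one common approach: logit-transform each study's proportion, weight by inverse variance, and add a DerSimonian-Laird between-study variance term. The proportions and sample sizes are invented for illustration and are not the paper's data.

```python
# A minimal, hypothetical sketch of a random-effects meta-analysis of agreement
# proportions (e.g., % of students endorsing "I like mathematics").
# Study proportions and sample sizes below are made up for illustration.
import numpy as np

# (proportion agreeing, sample size) per hypothetical study
studies = [(0.86, 1200), (0.59, 950), (0.35, 800)]

# Logit-transform each proportion; variance of a logit proportion is 1/(n*p*(1-p))
y = np.array([np.log(p / (1 - p)) for p, _ in studies])
v = np.array([1.0 / (n * p * (1 - p)) for p, n in studies])

# Fixed-effect (inverse-variance) estimate, then DerSimonian-Laird tau^2
w = 1.0 / v
mu_fe = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - mu_fe) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

# Random-effects pooled estimate, back-transformed to a proportion
w_re = 1.0 / (v + tau2)
mu_re = np.sum(w_re * y) / np.sum(w_re)
pooled = 1.0 / (1.0 + np.exp(-mu_re))
print(f"pooled agreement: {pooled:.2%}, tau^2 = {tau2:.3f}")
```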
The Effect of Device Type on Achievement: Evidence from a Quasi-Experimental Design
IF 1.5
Educational Assessment Pub Date : 2022-03-02 DOI: 10.1080/10627197.2022.2043742
D. Rutkowski, Leslie Rutkowski, C. Flores
{"title":"The Effect of Device Type on Achievement: Evidence from a Quasi-Experimental Design","authors":"david. rutkowski, Leslie Rutkowski, C. Flores","doi":"10.1080/10627197.2022.2043742","DOIUrl":"https://doi.org/10.1080/10627197.2022.2043742","url":null,"abstract":"ABSTRACT As more states move to universal computer-based assessments, an emergent issue concerns the effect that device type might have on student results. Although, several research studies have explored device effects, most of these studies focused on the differences between tablets and desktops/laptops. In the current study, we distinguish between different types of devices to better examine the differences. Specifically, we used Indiana state assessment results from grades 3 and 8 and a propensity score weighting method to see if a student took the assessment on another device, would they have received the same score? Our findings suggest that there are significant differences by device type in both grades. In particular, iPad and Chromebook devices produced higher achievement when compared to Mac and PC devices. At the extreme, these differences amounted to close to a third of a standard deviation on the achievement scale.","PeriodicalId":46209,"journal":{"name":"Educational Assessment","volume":"27 1","pages":"229 - 246"},"PeriodicalIF":1.5,"publicationDate":"2022-03-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43376651","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 1
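The abstract names propensity score weighting as the quasi-experimental strategy. The sketch below illustrates the general technique on simulated data for a simplified two-device contrast (the study itself compared several device types); the covariates, column names, and effect size are assumptions for demonstration, not the authors' model.

```python
# A hedged sketch of inverse-probability-of-treatment weighting for a
# device-effect comparison on simulated data; all variables are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

n = 1000
df = pd.DataFrame({
    "prior_score": np.random.normal(200, 25, n),
    "frl":         np.random.binomial(1, 0.4, n),   # free/reduced lunch flag
    "chromebook":  np.random.binomial(1, 0.5, n),   # 1 = Chromebook, 0 = PC
})
df["score"] = 0.8 * df["prior_score"] + 5 * df["chromebook"] + np.random.normal(0, 10, n)

# 1. Model the probability of testing on a Chromebook from observed covariates
ps_model = LogisticRegression(max_iter=1000).fit(df[["prior_score", "frl"]], df["chromebook"])
ps = ps_model.predict_proba(df[["prior_score", "frl"]])[:, 1]

# 2. Inverse-probability weights (ATE weights)
w = np.where(df["chromebook"].to_numpy() == 1, 1 / ps, 1 / (1 - ps))

# 3. Weighted mean difference approximates the device effect under the usual
#    no-unmeasured-confounding assumption
treated = (df["chromebook"] == 1).to_numpy()
effect = (np.average(df.loc[treated, "score"], weights=w[treated])
          - np.average(df.loc[~treated, "score"], weights=w[~treated]))
print(f"estimated device effect: {effect:.2f} scale-score points")
```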
Disrupting White Supremacy in Assessment: Toward a Justice-Oriented, Antiracist Validity Framework
IF 1.5
Educational Assessment Pub Date : 2022-02-17 DOI: 10.1080/10627197.2022.2042682
Jennifer Randall, David Slomp, Mya Poe, M. Oliveri
{"title":"Disrupting White Supremacy in Assessment: Toward a Justice-Oriented, Antiracist Validity Framework","authors":"Jennifer Randall, David Slomp, Mya Poe, M. Oliveri","doi":"10.1080/10627197.2022.2042682","DOIUrl":"https://doi.org/10.1080/10627197.2022.2042682","url":null,"abstract":"ABSTRACT In this article, we propose a justice-oriented, antiracist validity framework designed to disrupt assessment practices that continue to (re)produce racism through the uncritical promotion of white supremist hegemonic practices. Using anti-Blackness as illustration, we highlight the ways in which racism is introduced, or ignored, in current assessment and validation processes and how an antiracist approach can be enacted. To start our description of the framework, we outline the foundational theories and practices (e.g., critical race theory & antiracist assessment) and justice-based framings, which serve as the base for our framework. We then focus on Kane’s interpretive use argument and Mislevy’s sociocognitive approach and suggest extending them to include an antiracist perspective. To this end, we propose a set of heuristics organized around a validity argument that holds justice-oriented, antiracist theories and practices at its core.","PeriodicalId":46209,"journal":{"name":"Educational Assessment","volume":"29 1","pages":"170 - 178"},"PeriodicalIF":1.5,"publicationDate":"2022-02-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"59626287","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 12
Multimodal Tasks to Assess English Learners and Their Peers in Science
IF 1.5
Educational Assessment Pub Date : 2022-01-02 DOI: 10.1080/10627197.2022.2028139
Scott E. Grapin, Lorena Llosa
{"title":"Multimodal Tasks to Assess English Learners and Their Peers in Science","authors":"Scott E. Grapin, Lorena Llosa","doi":"10.1080/10627197.2022.2028139","DOIUrl":"https://doi.org/10.1080/10627197.2022.2028139","url":null,"abstract":"ABSTRACT Traditionally, content assessments have been carried out through written language. However, the latest standards in U.S. K-12 education expect all students, including English learners (ELs), to demonstrate their content learning using multiple modalities. This study examined the performance of fifth-grade students at varying levels of English proficiency on four science tasks that elicited responses in visual, written, and oral modalities. Findings revealed that approximately half of students performed differently in visual versus written modalities on each task. However, performance did not consistently favor the visual modality for ELs, likely due to challenges related to visual representation in some areas of science. Additionally, triangulating students’ visual and written responses with their oral responses yielded more accurate interpretations of their science understanding. Collectively, these findings indicate the potential of multimodal assessment for providing more complete and accurate information about what ELs and their peers know and can do in the content areas.","PeriodicalId":46209,"journal":{"name":"Educational Assessment","volume":"27 1","pages":"46 - 70"},"PeriodicalIF":1.5,"publicationDate":"2022-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48582048","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
Investigating the Effect of the Programme of Study on University Students' Perceptions about Assessment
IF 1.5
Educational Assessment Pub Date : 2022-01-02 DOI: 10.1080/10627197.2022.2027753
Diana Pereira, I. Cadime, M. Flores, C. Pinheiro, Patrícia Santos
{"title":"Investigating the Effect of the Programme of Study on University Students’ Perceptions about Assessment","authors":"Diana Pereira, I. Cadime, M. Flores, C. Pinheiro, Patrícia Santos","doi":"10.1080/10627197.2022.2027753","DOIUrl":"https://doi.org/10.1080/10627197.2022.2027753","url":null,"abstract":"ABSTRACT This study focuses on the effect of the programme variable on the purposes and effects that students associate with assessment, on the assessment methods used and on the perceived use of assessment. Data were collected in five Portuguese Public Universities through a survey (n = 4144) and focus group (n = 250) with students enrolled in different programmes. Findings point to statistically significant differences in relation to the purpose of assessment, assessment methods most used and perceived use of assessment. The main differences were found in the kinds of methods used in different programmes: Law reported the lowest frequency of the use of collective assessment methods and portfolios, whereas Psychology, Mechanical and Industrial Engineering were the programmes that reported the lowest frequency of use of individual methods. Educational sciences reported more frequency of all types of methods and reported significantly more preference for the use of alternative methods than the remaining programmes. Negative emotions were most associated with assessment by Nursing students and Educational Sciences’ students reported more participation in the assessment process than students from all other programmes. Implications of the findings are discussed.","PeriodicalId":46209,"journal":{"name":"Educational Assessment","volume":"27 1","pages":"71 - 92"},"PeriodicalIF":1.5,"publicationDate":"2022-01-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48672623","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Language Matters: Teacher and Parent Perceptions of Achievement Labels from Educational Tests
IF 1.5
Educational Assessment Pub Date : 2021-12-29 DOI: 10.1080/10627197.2021.2016388
Francis O’Donnell, S. Sireci
{"title":"Language Matters: Teacher and Parent Perceptions of Achievement Labels from Educational Tests","authors":"Francis O’Donnell, S. Sireci","doi":"10.1080/10627197.2021.2016388","DOIUrl":"https://doi.org/10.1080/10627197.2021.2016388","url":null,"abstract":"ABSTRACT Since the standards-based assessment practices required by the No Child Left Behind legislation, almost all students in the United States are “labeled” according to their performance on educational achievement tests. In spite of their widespread use in reporting test results, research on how achievement level labels are perceived by teachers, parents, and students is minimal. In this study, we surveyed teachers (N = 51) and parents (N = 50) regarding their perceptions of 73 achievement labels (e.g., inadequate, level 2, proficient) used in statewide testing programs. These teachers and parents also sorted the labels according to their similarity. Using multidimensional scaling, we found labels used to denote the same level of performance (e.g., basic and below proficient) were perceived to differ in important ways, including in their tone and how much achievement they convey. Additionally, some labels were perceived as more encouraging or clear than others. Teachers’ and parents’ perceptions were similar, with a few exceptions. The results have important implications for reporting results that encourage, rather than discourage, student learning.","PeriodicalId":46209,"journal":{"name":"Educational Assessment","volume":"27 1","pages":"1 - 26"},"PeriodicalIF":1.5,"publicationDate":"2021-12-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46569614","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
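The study maps perceptions of achievement labels by applying multidimensional scaling to similarity-sorting data. The sketch below shows, on invented sorting counts, how sorted-together frequencies can be converted into a dissimilarity matrix and passed to a standard MDS routine; the labels, counts, and number of sorters are illustrative only.

```python
# A minimal sketch of MDS on a sorting-derived dissimilarity matrix.
# Sorting counts are randomly generated stand-ins, not the study's data.
import numpy as np
from sklearn.manifold import MDS

labels = ["below basic", "basic", "proficient", "advanced", "level 2"]
n_sorters = 100

# times_together[i, j]: how many sorters placed labels i and j in the same pile
rng = np.random.default_rng(1)
times_together = rng.integers(10, n_sorters, size=(5, 5))
times_together = (times_together + times_together.T) // 2   # symmetrize
np.fill_diagonal(times_together, n_sorters)

# Dissimilarity = 1 - proportion of sorters grouping the pair together
dissim = 1.0 - times_together / n_sorters

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissim)
for label, (x, y) in zip(labels, coords):
    print(f"{label:>12}: ({x:+.2f}, {y:+.2f})")
```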
Investigating the Effects of Test Accommodations with Process Data for English Learners in a Mathematics Assessment
IF 1.5
Educational Assessment Pub Date : 2021-09-29 DOI: 10.1080/10627197.2021.1982693
M. Wolf, Hanwook Yoo, Danielle Guzman-Orth, J. Abedi
{"title":"Investigating the Effects of Test Accommodations with Process Data for English Learners in a Mathematics Assessment","authors":"M. Wolf, Hanwook Yoo, Danielle Guzman-Orth, J. Abedi","doi":"10.1080/10627197.2021.1982693","DOIUrl":"https://doi.org/10.1080/10627197.2021.1982693","url":null,"abstract":"ABSTRACT Implementing a randomized controlled trial design, the present study investigated the effects of two types of accommodations, linguistic modification and a glossary, for English learners (ELs) taking a computer-based mathematics assessment. Process data including response time and clicks on glossary words were also examined to better interpret students’ interaction with the accommodations in the testing conditions. Regression and ANOVA analyses were performed with data from 513 students (189 ELs and 324 non-ELs) in Grade 9. No statistically significant accommodation effects were detected in this study. Process data revealed possible explanations (i.e., student engagement and glossary usage) for the nonsignificant results. Implications for future research on test accommodations for EL students are discussed.","PeriodicalId":46209,"journal":{"name":"Educational Assessment","volume":"27 1","pages":"27 - 45"},"PeriodicalIF":1.5,"publicationDate":"2021-09-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45298505","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
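As a rough illustration of the regression/ANOVA analyses named in the abstract, the sketch below fits a condition-by-EL-status model on simulated scores. Variable names, group proportions, and effect sizes are assumptions for demonstration and do not reproduce the study's dataset or exact model.

```python
# A hedged sketch of a two-way condition-by-EL-status analysis on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
n = 513
df = pd.DataFrame({
    "condition": rng.choice(["control", "linguistic_mod", "glossary"], size=n),
    "el":        rng.choice([0, 1], size=n, p=[0.63, 0.37]),
})
df["score"] = 50 + 2 * df["el"] + rng.normal(0, 10, size=n)

# Two-way model: accommodation condition, EL status, and their interaction
model = smf.ols("score ~ C(condition) * C(el)", data=df).fit()
print(anova_lm(model, typ=2))   # Type II ANOVA table for main effects + interaction
```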
A Methodology for Determining and Validating Latent Factor Dimensionality of Complex Multi-Factor Science Constructs Measuring Knowledge-In-Use
IF 1.5
Educational Assessment Pub Date : 2021-09-05 DOI: 10.1080/10627197.2021.1971966
Leonora Kaldaras, Hope O. Akaeze, J. Krajcik
{"title":"A Methodology for Determining and Validating Latent Factor Dimensionality of Complex Multi-Factor Science Constructs Measuring Knowledge-In-Use","authors":"Leonora Kaldaras, Hope O. Akaeze, J. Krajcik","doi":"10.1080/10627197.2021.1971966","DOIUrl":"https://doi.org/10.1080/10627197.2021.1971966","url":null,"abstract":"ABSTRACT Deep science understanding is reflected in students’ ability to use content and skills when making sense of the world. Assessing deep understanding requires measuring complex constructs that combine elements of content and skills. To develop valid measures of complex constructs, we need to understand how their theoretical dimensionality, reflected in the integration of content and skills, is manifested in practice. This work is developed in the context of the Framework for K-12 Science Education and Next-Generation Science Standards (NGSS). We introduce a methodology that describes steps for creating a theoretical validity argument for measuring complex NGSS constructs, designing operational assessments based on this argument, and obtaining empirical evidence for the validity of the argument and assessments, focusing on how theoretically suggested dimensionality of NGSS constructs is manifested in practice. Results have implications for developing valid NGSS assessments and reporting student progress on high-stakes and diagnostic evaluation.","PeriodicalId":46209,"journal":{"name":"Educational Assessment","volume":"26 1","pages":"241 - 263"},"PeriodicalIF":1.5,"publicationDate":"2021-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45735723","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 8
Assessing Source Evaluation Skills of Middle School Students Using Learning Progressions
IF 1.5
Educational Assessment Pub Date : 2021-09-01 DOI: 10.1080/10627197.2021.1966299
Jesse R. Sparks, P. V. van Rijn, P. Deane
{"title":"Assessing Source Evaluation Skills of Middle School Students Using Learning Progressions","authors":"Jesse R. Sparks, P. V. van Rijn, P. Deane","doi":"10.1080/10627197.2021.1966299","DOIUrl":"https://doi.org/10.1080/10627197.2021.1966299","url":null,"abstract":"ABSTRACT Effectively evaluating the credibility and accuracy of multiple sources is critical for college readiness. We developed 24 source evaluation tasks spanning four predicted difficulty levels of a hypothesized learning progression (LP) and piloted these tasks to evaluate the utility of an LP-based approach to designing formative literacy assessments. Sixth, seventh, and eighth grade students (N = 360, 120 per grade) completed 12 of the 24 tasks in an online testing session. Analyses examined the tasks’ reliability and validity and whether patterns of performance aligned to predicted LP levels (i.e., recovery of the LP) using task progression maps derived from item response theory (IRT). Results suggested that the LP tasks were reliable and correlated with external measures; however, some lower level tasks proved unexpectedly difficult. Possible explanations for low performance are discussed, followed by implications for future LP and task revisions. This work provides a model for designing and evaluating LP-based literacy assessments.","PeriodicalId":46209,"journal":{"name":"Educational Assessment","volume":"26 1","pages":"213 - 240"},"PeriodicalIF":1.5,"publicationDate":"2021-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41628487","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 6
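The study checks whether empirical task difficulty recovers the hypothesized learning progression using IRT-derived task progression maps. The simplified sketch below substitutes a crude proportion-correct difficulty estimate for a full IRT calibration and checks the rank correlation between estimated difficulty and predicted LP level; all data are simulated and no element of it is the authors' calibration.

```python
# A rough, illustrative sketch of learning-progression recovery:
# compare predicted LP levels with empirically estimated task difficulty.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n_students, n_tasks = 360, 24
predicted_level = np.repeat([1, 2, 3, 4], 6)            # 6 tasks per LP level

# Simulate responses that get harder at higher LP levels
true_difficulty = predicted_level + rng.normal(0, 0.5, n_tasks)
ability = rng.normal(2.5, 1.0, n_students)
p_correct = 1 / (1 + np.exp(-(ability[:, None] - true_difficulty[None, :])))
responses = rng.binomial(1, p_correct)

# Crude difficulty estimate: logit of the proportion incorrect per task
p = responses.mean(axis=0)
est_difficulty = np.log((1 - p) / p)

rho, pval = spearmanr(predicted_level, est_difficulty)
print(f"Spearman rho between predicted LP level and difficulty: {rho:.2f} (p = {pval:.3f})")
```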
An Intersectional Approach to DIF: Do Initial Findings Hold across Tests?
IF 1.5
Educational Assessment Pub Date : 2021-08-22 DOI: 10.1080/10627197.2021.1965473
M. Russell, Olivia Szendey, Larry Kaplan
{"title":"An Intersectional Approach to DIF: Do Initial Findings Hold across Tests?","authors":"M. Russell, Olivia Szendey, Larry Kaplan","doi":"10.1080/10627197.2021.1965473","DOIUrl":"https://doi.org/10.1080/10627197.2021.1965473","url":null,"abstract":"ABSTRACT Differential Item Function (DIF) analysis is commonly employed to examine potential bias produced by a test item. Since its introduction DIF analyses have focused on potential bias related to broad categories of oppression, including gender, racial stratification, economic class, and ableness. More recently, efforts to examine the effects of oppression on valued life-outcomes have employed an intersectional approach to more fully represent a person’s identity and capture the multiple, and often compound, impacts of oppression. The study presented here replicated an intersectional approach to DIF analyses to examine whether findings from a previous study that focused on a single grade-level achievement test generalized to other subject areas and grade levels. Findings indicate that the use of an intersectional approach is more sensitive to detecting potential item bias and that this increased sensitivity holds across the subject areas and grade levels examined.","PeriodicalId":46209,"journal":{"name":"Educational Assessment","volume":"26 1","pages":"284 - 298"},"PeriodicalIF":1.5,"publicationDate":"2021-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42202501","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 4
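DIF can be screened several ways; one common approach compatible with an intersectional grouping variable (e.g., gender, race, and economic status combined into a single group code) is logistic-regression DIF, sketched below on simulated data. This is offered as a generic illustration of the technique, not the authors' procedure, and every variable name and value is hypothetical.

```python
# A hedged sketch of logistic-regression DIF screening with a combined
# (intersectional) grouping variable; data and group labels are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 2000
df = pd.DataFrame({
    "total": rng.normal(0, 1, n),                        # matching/ability score
    "group": rng.choice(["ref", "focal_a", "focal_b"], size=n),
})
logit_p = 0.9 * df["total"] - 0.4 * (df["group"] == "focal_b")
df["item"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Compare nested models: uniform DIF is flagged when adding the group term
# improves fit beyond the ability score alone (likelihood-ratio test)
base = smf.logit("item ~ total", data=df).fit(disp=0)
dif = smf.logit("item ~ total + C(group)", data=df).fit(disp=0)
lr = 2 * (dif.llf - base.llf)
print(f"LR chi-square for uniform DIF (2 df): {lr:.2f}")
```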