{"title":"The use of process data in large-scale assessments: a literature review","authors":"Ella Anghel, Lale Khorramdel, Matthias von Davier","doi":"10.1186/s40536-024-00202-1","DOIUrl":"https://doi.org/10.1186/s40536-024-00202-1","url":null,"abstract":"<p>As the use of process data in large-scale educational assessments is becoming more common, it is clear that data on examinees’ test-taking behaviors can illuminate their performance, and can have crucial ramifications concerning assessments’ validity. A thorough review of the literature in the field may inform researchers and practitioners of common findings as well as existing gaps. This literature review used topic modeling to identify themes in 221 empirical studies using process data in large-scale assessments. We identified six recurring topics: response time models, response time-general, aberrant test-taking behavior, action sequences, complex problem-solving, and digital writing. We also discuss the prominent theories used by studies in each category. Based on these findings, we suggest directions for future research applying process data from large-scale assessments.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"12 1","pages":""},"PeriodicalIF":3.1,"publicationDate":"2024-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140882012","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An engagement-aware predictive model to evaluate problem-solving performance from the study of adult skills' (PIAAC 2012) process data","authors":"Jinnie Shin, Bowen Wang, Wallace N. Pinto Junior, Mark J. Gierl","doi":"10.1186/s40536-024-00194-y","DOIUrl":"https://doi.org/10.1186/s40536-024-00194-y","url":null,"abstract":"<h3>Abstract</h3> <p>The benefits of incorporating process information in a large-scale assessment with the complex micro-level evidence from the examinees (i.e., process log data) are well documented in the research across large-scale assessments and learning analytics. This study introduces a deep-learning-based approach to predictive modeling of the examinee’s performance in sequential, interactive problem-solving tasks from a large-scale assessment of adults' educational competencies. The current methods disambiguate problem-solving behaviors using network analysis to inform the examinee's performance in a series of problem-solving tasks. The unique contribution of this framework lies in the introduction of an “effort-aware” system. The system considers the information regarding the examinee’s task-engagement level to accurately predict their task performance. The study demonstrates the potential to introduce a high-performing deep learning model to learning analytics and examinee performance modeling in a large-scale problem-solving task environment collected from the OECD Programme for the International Assessment of Adult Competencies (PIAAC 2012) test in multiple countries, including the United States, South Korea, and the United Kingdom. 
Our findings indicated a close relationship between the examinee's engagement level and their problem-solving skills, as well as the importance of modeling them together to obtain a better measure of students’ problem-solving performance.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"6 1","pages":""},"PeriodicalIF":3.1,"publicationDate":"2024-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140017973","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The potential of international large-scale assessments for meta-analyses in education","authors":"Ronny Scherer, Fazilat Siddiq, Trude Nilsen","doi":"10.1186/s40536-024-00191-1","DOIUrl":"https://doi.org/10.1186/s40536-024-00191-1","url":null,"abstract":"<p>Meta-analyses and international large-scale assessments (ILSA) are key sources for informing educational policy, research, and practice. While many critical research questions could be addressed by drawing evidence from both of these sources, meta-analysts seldom integrate ILSAs, and current integration practices lack methodological guidance. The aim of this methodological review is therefore to synthesize and illustrate the principles and practices of including ILSA data in meta-analyses. Specifically, we (a) review four ILSA data inclusion approaches (analytic steps, potential, challenges); (b) examine whether and how existing meta-analyses included ILSA data; and (c) provide a hands-on illustrative example of how to implement the four approaches. Seeing the need for meta-analyses on educational inequalities, we situated the review and illustration in the context of gender differences and socioeconomic gaps in student achievement. Ultimately, we outline the steps meta-analysts could take to utilize the potential and address the challenges of ILSA data for meta-analyses in education.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"13 1","pages":""},"PeriodicalIF":3.1,"publicationDate":"2024-02-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139763576","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Examining successful and unsuccessful time management through process data: two novel indicators of test-taking behaviors","authors":"Elena C. Papanastasiou, Michalis P. Michaelides","doi":"10.1186/s40536-024-00193-z","DOIUrl":"https://doi.org/10.1186/s40536-024-00193-z","url":null,"abstract":"<p>Test-taking behavior is a potential source of construct irrelevant variance for test scores in international large-scale assessments where test-taking effort, motivation, and behaviors in general tend to be confounded with test scores. In an attempt to disentangle this relationship and gain further insight into examinees’ test-taking processes, researchers can now utilize process and timing data to obtain a more comprehensive view of test-taking behaviors, such as test-taking effort. The purpose of this study is to propose and evaluate two novel response-based, standardized indicators of test-taking behaviors that utilize a combination of examinee response and process (timing) data to better understand and describe test-taking effort in ILSAs. These indices were empirically estimated with USA data from two booklets from e-TIMSS 2019 in mathematics for grade 4. In addition, their predictive validity was examined with respect to achievement estimates. 
Their network of associations with other relevant variables, such as motivation and interest in the subject, as well as across subjects, was also examined to test their intra-individual stability in e-TIMSS.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"264 1","pages":""},"PeriodicalIF":3.1,"publicationDate":"2024-02-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139763710","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Bayesian workflow for the analysis and reporting of international large-scale assessments: a case study using the OECD teaching and learning international survey","authors":"David Kaplan, Kjorte Harra","doi":"10.1186/s40536-023-00189-1","DOIUrl":"https://doi.org/10.1186/s40536-023-00189-1","url":null,"abstract":"<p>This paper aims to showcase the value of implementing a Bayesian framework to analyze and report results from international large-scale assessments and provide guidance to users who want to analyse ILSA data using this approach. The motivation for this paper stems from the recognition that Bayesian statistical inference is fast becoming a popular methodological framework for the analysis of educational data generally, and large-scale assessments more specifically. The paper argues that Bayesian statistical methods can provide a more nuanced analysis of results of policy relevance compared to standard frequentist approaches commonly found in large-scale assessment reports. The data utilized for this paper comes from the Teaching and Learning International Survey (TALIS). The paper provides steps in implementing a Bayesian analysis and proposes a workflow that can be applied not only to TALIS but to large-scale assessments in general. 
The paper closes with a discussion of other Bayesian approaches to international large-scale assessment data, particularly for predictive modeling.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"79 1","pages":""},"PeriodicalIF":3.1,"publicationDate":"2024-01-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139649008","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Does teachers’ motivation have an impact on students’ scientific literacy and motivation? An empirical study in Colombia with data from PISA 2015","authors":"Ana María Suárez-Mesa, Ricardo L. Gómez","doi":"10.1186/s40536-023-00190-8","DOIUrl":"https://doi.org/10.1186/s40536-023-00190-8","url":null,"abstract":"<p>In this study we use data from the Programme for International Student Assessment (PISA) to investigate the effect of teachers’ motivation on students’ scientific literacy and motivation in Colombia. These relationships are explored using a multilevel modeling framework and through the lens of Self-Determination Theory. Although difficulties in achieving education quality in developing and emerging economies are commonly attributed to teacher motivation issues, and important policy measures are implemented based on this assumption, this topic remains largely empirically unexplored. The purpose of the study is to help fill this gap and provide empirically based insights for a broader and more informed dialogue regarding the effect of motivation on the development of scientific literacy, and for the design and implementation of evidence-based policies, instructional practices, and interventions. In this analysis, we did not find a significant relationship between teacher motivation and either students’ scientific literacy or motivation. However, students’ interest in science and sense of self-efficacy were significantly associated with their own achievement. The results also show that teacher-directed instruction is the strongest predictor of scientific literacy, as opposed to inquiry-based teaching. 
However, inquiry-based teaching was found to be a positive predictor of increased students’ motivation.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"22 1","pages":""},"PeriodicalIF":3.1,"publicationDate":"2024-01-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"139373840","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Trends in educational inequalities in Ireland’s primary schools: an analysis based on TIMSS data (2011–2019)","authors":"Alice Duggan, Anastasios Karakolidis, Aidan Clerkin, Lorraine Gilleece, Rachel Perkins","doi":"10.1186/s40536-023-00188-2","DOIUrl":"https://doi.org/10.1186/s40536-023-00188-2","url":null,"abstract":"<h3 data-test=\"abstract-sub-heading\">Background</h3><p>Socioeconomic characteristics are persistently and systematically related to academic outcomes, despite long-standing efforts to reduce educational inequality. Ireland has a strong policy focus on alleviating educational disadvantage and has seen significant improvements in mathematics and science performance in recent years. This study investigates patterns of socioeconomic inequalities in 4th grade students’ performance in mathematics and science between 2011 and 2019. Two measures of inequality are examined: (i) inequality of achievement, i.e., the degree of variability in student performance and (ii) inequality of opportunity, i.e., the extent to which student performance is related to background characteristics.</p><h3 data-test=\"abstract-sub-heading\">Methods</h3><p>Data for 4th-grade students in Ireland from TIMSS 2011, TIMSS 2015 and TIMSS 2019 were used. Mathematics and science achievement were the main outcome measures. The home resources for learning index was used as a proxy for student-level socioeconomic status. School-level socioeconomic status was examined according to schools’ participation in the <i>Delivering Equality of Opportunity in Schools</i> (DEIS) programme, which is the Department of Education’s main policy initiative addressing educational disadvantage. 
Descriptive and multilevel regression analyses were conducted to explore variability in student performance and to investigate the variance in achievement explained by socioeconomic factors, across cycles and subjects.</p><h3 data-test=\"abstract-sub-heading\">Results</h3><p>Between 2011 and 2015, between-student and between-school differences in mathematics and science performance became smaller, as shown by the decrease in standard deviations and the intraclass correlation coefficients (ICCs). This points to reduced inequality of achievement. Between 2015 and 2019, a small increase in inequality of achievement was observed.</p><p>Regarding inequality of opportunity, students’ home resources for learning and school disadvantaged status were statistically significantly related to mathematics and science achievement across all three cycles. Overall variance explained by these two variables increased from 2011 to 2019. This points towards increasing inequality of opportunity over the period examined.</p><p>Performance gaps between disadvantaged and non-disadvantaged schools have been reduced over time; however, the relationship between home resources for learning and achievement appears to have strengthened. Findings were consistent for both subjects.</p><h3 data-test=\"abstract-sub-heading\">Conclusions</h3><p>The findings indicate that improvements in overall performance do not necessarily reflect improved equality. Ireland’s improvements in average mathematics and science performance between 2011 and 2015 were accompanied by reduced inequality of achievement. 
Performance differences between disadvantaged and non-disadva","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"4 1","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-12-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138629613","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Factors related to students’ psychological distress during COVID-19 disruption across countries","authors":"Mojca Rožman, Yuan-Ling Liaw, Minge Chen","doi":"10.1186/s40536-023-00186-4","DOIUrl":"https://doi.org/10.1186/s40536-023-00186-4","url":null,"abstract":"<h3 data-test=\"abstract-sub-heading\">Background</h3><p>The global outbreak of the novel COVID-19 virus presented a significant threat to students’ well-being across the globe. In this paper, we construct a measure of student psychological distress related to COVID-19 disruption. We then examine the variation in students’ psychological distress as a function of student demographic characteristics, home, school and student factors and compare the results across countries.</p><h3 data-test=\"abstract-sub-heading\">Methods</h3><p>We use item response theory to construct a comparable scale for students’ psychological distress across participating countries. Furthermore, we employ linear regression to explore the association of student characteristics and other student and school factors.</p><h3 data-test=\"abstract-sub-heading\">Results</h3><p>An internationally comparable scale for students’ psychological distress was constructed using the model assuming equal item parameters across countries. This enables us to compare the levels of students’ psychological distress and its relationships with the construct across countries. The most important factors contributing to students’ psychological distress were school support, school belonging, disrupted sleep, difficulties in learning after the disruption and preparedness for future disruptions. In some countries, we find suggestive evidence that boys exhibited lower psychological distress than girls. 
We do not find any meaningful relationship between home resources and the students’ psychological distress scale.</p><h3 data-test=\"abstract-sub-heading\">Conclusions</h3><p>Students across participating countries expressed negative feelings about schooling and events happening during the disruption and their effects on their future. We find indications that some school and student factors had a significant relationship with students’ psychological distress in many countries. This was especially the case in countries where remote learning took place during the disruption. In addition, differences across countries are found. The key finding is that high psychological distress is present in all countries studied around the world. However, it is important to note that the factors contributing to this distress are not the same everywhere. Therefore, potential interventions must consider country-specific factors.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"31 1","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138576681","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Who are those random responders on your survey? The case of the TIMSS 2015 student questionnaire","authors":"Jianan Chen, Saskia van Laar, Johan Braeken","doi":"10.1186/s40536-023-00184-6","DOIUrl":"https://doi.org/10.1186/s40536-023-00184-6","url":null,"abstract":"<p>A general validity and survey quality concern with student questionnaires under low-stakes assessment conditions is that some responders will not genuinely engage with the questionnaire, often with more random response patterns as a result. Using a mixture IRT approach and a meta-analytic lens across 22 educational systems participating in TIMSS 2015, we investigated whether the prevalence of random responders on six scales regarding students’ engagement and attitudes toward mathematics and sciences was a function of grade, gender, socio-economic status, spoken language at home, or migration background. Among these common policy-relevant covariates in educational research, we found support for small group differences in prevalence of random responders (<span>(OR ≥ 1.22)</span>) (average prevalence of 7%). In general, being a student in grade 8 (vs. grade 4), being male, reporting to have fewer books, or speaking a language different from the test language at home were all risk factors characterizing random responders. 
The expected generalization and implications of these findings are discussed based on the observed heterogeneity across educational systems and consistency across questionnaire scales.</p>","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"12 6","pages":""},"PeriodicalIF":3.1,"publicationDate":"2023-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"138496856","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Behavioral patterns in collaborative problem solving: a latent profile analysis based on response times and actions in PISA 2015","authors":"Areum Han, Florian Krieger, Francesca Borgonovi, Samuel Greiff","doi":"10.1186/s40536-023-00185-5","DOIUrl":"https://doi.org/10.1186/s40536-023-00185-5","url":null,"abstract":"Abstract Process data are becoming more and more popular in education research. In the field of computer-based assessments of collaborative problem solving (ColPS), process data have been used to identify students’ test-taking strategies while working on the assessment, and such data can be used to complement data collected on accuracy and overall performance. Such information can be used to understand, for example, whether students are able to use a range of styles and strategies to solve different problems, given evidence that such cognitive flexibility may be important in labor markets and societies. In addition, process information might help researchers better identify the determinants of poor performance and interventions that can help students succeed. However, this line of research, particularly research that uses these data to profile students, is still in its infancy and has mostly been centered on small- to medium-scale collaboration settings between people (i.e., the human-to-human approach). There are only a few studies involving large-scale assessments of ColPS between a respondent and computer agents (i.e., the human-to-agent approach), where problem spaces are more standardized and fewer biases and confounds exist. In this study, we investigated students’ ColPS behavioral patterns using latent profile analyses (LPA) based on two types of process data (i.e., response times and the number of actions) collected from the Program for International Student Assessment (PISA) 2015 ColPS assessment, a large-scale international assessment of the human-to-agent approach. 
Analyses were conducted on test-takers who: (a) were administered the assessment in English and (b) were assigned the Xandar unit at the beginning of the test. The total sample size was N = 2,520. Analyses revealed two profiles (i.e., Profile 1 [95%] vs. Profile 2 [5%]) showing different behavioral characteristics across the four parts of the assessment unit. Significant differences were also found in overall performance between the profiles.","PeriodicalId":37009,"journal":{"name":"Large-Scale Assessments in Education","volume":"47 2","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-11-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"136346378","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}