{"title":"评估 2022 年国际学生评估项目(PISA)中学生在线信息调查问卷做法的测量不变性:使用 MGCFA 和排列法的比较研究","authors":"Esra Sözer Boz","doi":"10.1007/s10639-024-12921-7","DOIUrl":null,"url":null,"abstract":"<p>International large-scale assessments provide cross-national data on students’ cognitive and non-cognitive characteristics. A critical methodological issue that often arises in comparing data from cross-national studies is ensuring measurement invariance, indicating that the construct under investigation is the same across the compared groups. This study addresses the measurement invariance of students’ practices regarding online information (ICTINFO) questionnaire across countries in the PISA 2022 cycle. Some methodological complexities have arisen when testing the measurement invariance across the presence of many groups. For testing measurement invariance, the multiple group confirmatory factor analysis (MGCFA), which is a traditional procedure, was employed first, and then a novel approach, the alignment method, was performed. This study comprised 29 OECD countries, with a total sample size of 187.614 15-year-old students. The MGCFA results revealed that metric invariance was achieved across countries, indicating comparable factor loadings while not the same for factor means. Consistent with MGCFA results, the alignment method identified noninvariant parameters exceeding the 25% cut-off criteria across countries. Monte Carlo simulation validated the reliability of the alignment results. This study contributes to international assessments by providing a detailed examination of measurement invariance and comparing the findings from various methodologies for improving assessment accuracy. 
The results provide evidence-based recommendations for policymakers to ensure fair and equitable evaluations of student performance across different countries, thereby contributing to more reliable and valid international assessments.</p>","PeriodicalId":51494,"journal":{"name":"Education and Information Technologies","volume":null,"pages":null},"PeriodicalIF":4.8000,"publicationDate":"2024-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Evaluating measurement invariance of students’ practices regarding online information questionnaire in PISA 2022: a comparative study using MGCFA and alignment method\",\"authors\":\"Esra Sözer Boz\",\"doi\":\"10.1007/s10639-024-12921-7\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>International large-scale assessments provide cross-national data on students’ cognitive and non-cognitive characteristics. A critical methodological issue that often arises in comparing data from cross-national studies is ensuring measurement invariance, indicating that the construct under investigation is the same across the compared groups. This study addresses the measurement invariance of students’ practices regarding online information (ICTINFO) questionnaire across countries in the PISA 2022 cycle. Some methodological complexities have arisen when testing the measurement invariance across the presence of many groups. For testing measurement invariance, the multiple group confirmatory factor analysis (MGCFA), which is a traditional procedure, was employed first, and then a novel approach, the alignment method, was performed. This study comprised 29 OECD countries, with a total sample size of 187.614 15-year-old students. The MGCFA results revealed that metric invariance was achieved across countries, indicating comparable factor loadings while not the same for factor means. 
Consistent with MGCFA results, the alignment method identified noninvariant parameters exceeding the 25% cut-off criteria across countries. Monte Carlo simulation validated the reliability of the alignment results. This study contributes to international assessments by providing a detailed examination of measurement invariance and comparing the findings from various methodologies for improving assessment accuracy. The results provide evidence-based recommendations for policymakers to ensure fair and equitable evaluations of student performance across different countries, thereby contributing to more reliable and valid international assessments.</p>\",\"PeriodicalId\":51494,\"journal\":{\"name\":\"Education and Information Technologies\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":4.8000,\"publicationDate\":\"2024-07-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Education and Information Technologies\",\"FirstCategoryId\":\"95\",\"ListUrlMain\":\"https://doi.org/10.1007/s10639-024-12921-7\",\"RegionNum\":2,\"RegionCategory\":\"教育学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"EDUCATION & EDUCATIONAL RESEARCH\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Education and Information Technologies","FirstCategoryId":"95","ListUrlMain":"https://doi.org/10.1007/s10639-024-12921-7","RegionNum":2,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Evaluating measurement invariance of students’ practices regarding online information questionnaire in PISA 2022: a comparative study using MGCFA and alignment method
International large-scale assessments provide cross-national data on students’ cognitive and non-cognitive characteristics. A critical methodological issue in comparing data from cross-national studies is ensuring measurement invariance, i.e., that the construct under investigation has the same meaning across the compared groups. This study examines the measurement invariance of the students’ practices regarding online information (ICTINFO) questionnaire across countries in the PISA 2022 cycle. Testing measurement invariance in the presence of many groups raises several methodological complexities. Measurement invariance was therefore tested first with the traditional procedure, multiple-group confirmatory factor analysis (MGCFA), and then with a more recent approach, the alignment method. The study comprised 29 OECD countries, with a total sample of 187,614 15-year-old students. The MGCFA results revealed that metric invariance was achieved across countries, indicating comparable factor loadings but not directly comparable factor means. Consistent with the MGCFA results, the alignment method identified noninvariant parameters exceeding the 25% cut-off criterion across countries. A Monte Carlo simulation supported the reliability of the alignment results. The study contributes to international assessments by providing a detailed examination of measurement invariance and by comparing findings from different methodologies to improve assessment accuracy. The results offer evidence-based recommendations for policymakers to ensure fair and equitable evaluations of student performance across countries, thereby contributing to more reliable and valid international assessments.
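For readers unfamiliar with the two procedures named in the abstract, the nested invariance hierarchy tested by MGCFA, and the loss minimized by the alignment method, can be sketched as follows. These are the standard textbook formulations (the alignment loss follows Asparouhov and Muthén's 2014 formulation), not equations taken from this paper; $p$ indexes items, $i$ persons, and $g$ groups (here, countries).

```latex
% Multiple-group CFA measurement model (configural invariance:
% same factor structure in every group, parameters freely estimated):
x_{ipg} = \nu_{pg} + \lambda_{pg}\,\eta_{ig} + \varepsilon_{ipg}

% Metric invariance adds equality of factor loadings across groups,
% which is what the MGCFA results above support:
\lambda_{pg} = \lambda_{p} \quad \text{for all } g

% Scalar invariance would further require equal intercepts,
% the condition needed before factor means can be compared:
\nu_{pg} = \nu_{p} \quad \text{for all } g

% The alignment method instead starts from the configural model and
% minimizes a total "simplicity" loss over all pairs of groups,
% favoring solutions with few large and many near-zero differences:
F = \sum_{p}\sum_{g_1 < g_2} w_{g_1,g_2}\, f\!\left(\lambda_{p g_1} - \lambda_{p g_2}\right)
  + \sum_{p}\sum_{g_1 < g_2} w_{g_1,g_2}\, f\!\left(\nu_{p g_1} - \nu_{p g_2}\right),
\qquad f(x) = \sqrt{\sqrt{x^{2} + \epsilon}}
```

Here $w_{g_1,g_2} = \sqrt{N_{g_1} N_{g_2}}$ weights group pairs by sample size and $\epsilon$ is a small constant (e.g., 0.01) that keeps $f$ differentiable at zero. The 25% cut-off mentioned in the abstract is a commonly used rule of thumb for the share of noninvariant parameters beyond which alignment-based mean comparisons are considered untrustworthy.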
About the journal:
The Journal of Education and Information Technologies (EAIT) is a platform for the range of debates and issues in the field of Computing Education as well as the many uses of information and communication technology (ICT) across many educational subjects and sectors. It probes the use of computing to improve education and learning in a variety of settings, platforms and environments.
The journal aims to provide perspectives at all levels, from the micro level of specific pedagogical approaches in Computing Education and applications or instances of use in classrooms, to macro concerns of national policies and major projects; from pre-school classes to adults in tertiary institutions; from teachers and administrators to researchers and designers; from institutions to online and lifelong learning. The journal is embedded in the research and practice of professionals within the contemporary global context, and its breadth and scope encourage debate on fundamental issues at all levels and from different research paradigms and learning theories.

The journal does not proselytize on behalf of the technologies (whether they be mobile, desktop, interactive, virtual, games-based or learning management systems) but rather provokes debate on all the complex relationships within and between computing and education, whether they are in informal or formal settings. It probes state-of-the-art technologies in Computing Education and also considers the design and evaluation of digital educational artefacts.

The journal aims to maintain and expand its international standing by careful selection on merit of the papers submitted, thus providing a credible ongoing forum for debate and scholarly discourse. Special Issues are occasionally published to cover particular issues in depth. EAIT invites readers to submit papers that draw inferences, probe theory and create new knowledge that informs practice, policy and scholarship. Readers are also invited to comment and reflect upon the arguments and opinions published. EAIT is the official journal of the Technical Committee on Education of the International Federation for Information Processing (IFIP), in partnership with UNESCO.