Gender differences in literacy in PIAAC: do assessment features matter?

Impact factor: 2.6 · Q1, Education & Educational Research
Ai Miyamoto, Britta Gauly, Anouk Zabal
Large-Scale Assessments in Education · Published 2024-07-09 · DOI: 10.1186/s40536-024-00208-9
Citations: 0

Abstract



Background

Previous research based on large-scale studies consistently suggests that, on average, male students tend to have lower literacy than their female peers during secondary schooling. However, this gender gap in literacy seems to “disappear” during adulthood. To date, only a few studies have investigated the role of assessment features in gender differences in adult literacy performance. This study aims to understand the relationship between assessment features and gender differences in literacy skills.

Methods

Using the German 2012 PIAAC data (N = 4,512), we applied item-level analyses using linear probability models to examine gender differences in the probability of solving a literacy item correctly with six assessment features including (1) text format, (2) text topics, (3) text length, (4) cognitive strategies, (5) numerical content of the text/questions, and (6) gender typicality of content.
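Read as an analysis sketch, the item-level linear probability model described here amounts to regressing a 0/1 "item solved correctly" indicator on gender, an assessment feature, and their interaction, so that the interaction coefficient is the gender gap (in probability units) specific to that feature. A minimal simulated illustration, not the authors' actual PIAAC code (all variable names and effect sizes below are invented):

```python
import numpy as np

# Simulated item-level data; effect sizes are made up for illustration.
rng = np.random.default_rng(0)
n = 4000
male = rng.integers(0, 2, size=n)     # 1 = male respondent
noncont = rng.integers(0, 2, size=n)  # 1 = item uses a noncontinuous text format

# Simulated "truth": men solve noncontinuous items more often (interaction = 0.10).
p_solve = 0.55 + 0.05 * noncont + 0.10 * male * noncont
correct = (rng.random(n) < p_solve).astype(float)  # 0/1 item outcome

# A linear probability model is just OLS on the binary outcome.
# Columns: intercept, male, noncontinuous format, male x format interaction.
X = np.column_stack([np.ones(n), male, noncont, male * noncont])
beta, *_ = np.linalg.lstsq(X, correct, rcond=None)

# beta[3] estimates the extra probability (in proportion units) that men
# solve a noncontinuous-format item correctly, relative to women.
print(f"estimated male x noncontinuous effect: {beta[3]:.3f}")
```

In this framing, a reported figure such as "men had a 13.4% higher probability" corresponds to a coefficient of about 0.134 in proportion units.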

Results

We found that men had a 13.4% higher probability of solving items with a noncontinuous text format correctly than women. Men also had a 9.4% higher probability of solving short text items correctly and a 4.6% higher probability of solving items with a medium/high numerical content in the question correctly than women. There were small to negligible gender differences in literacy performance in terms of text topics, cognitive strategies, and gender typicality of content.

Conclusions

Our findings highlight the role of text format, text length, and numerical content in gender differences in literacy skills, suggesting that further refining these practices can enhance the fairness and accuracy of literacy assessments. Specifically, we advocate for ongoing research aimed at understanding and minimizing the potential bias introduced by these assessment features. Such efforts are not only crucial for developing instruments that accurately measure literacy skills, but they also yield insights that hold significant implications for educational researchers and practitioners dedicated to creating more equitable assessment environments.

Source journal
Large-Scale Assessments in Education (Social Sciences – Education)
CiteScore: 4.30 · Self-citation rate: 6.50% · Annual articles: 16 · Review time: 13 weeks