Validation of two measures for assessing English vocabulary knowledge on web-based testing platforms: long-form assessments

Impact Factor: 1.1 · JCR Region 2 (Literature) · LANGUAGE & LINGUISTICS
Lee Drown, Nikole Giovannone, David B. Pisoni, Rachel M. Theodore
Journal: Linguistics Vanguard
DOI: 10.1515/lingvan-2022-0115 (https://doi.org/10.1515/lingvan-2022-0115)
Published: 2023-09-13 (Journal Article)
Citations: 0

Abstract

The goal of the current work was to develop and validate web-based measures for assessing English vocabulary knowledge. Two existing paper-and-pencil assessments, the Vocabulary Size Test (VST) and the Word Familiarity Test (WordFAM), were modified for web-based administration. In Experiment 1, participants (n = 100) completed the web-based VST. In Experiment 2, participants (n = 100) completed the web-based WordFAM. Results from these experiments confirmed that both tasks (1) could be completed online, (2) showed expected sensitivity to English frequency patterns, (3) exhibited high internal consistency, and (4) showed an expected range of item discrimination scores, with low frequency items exhibiting higher item discrimination scores compared to high frequency items. This work provides open-source English vocabulary knowledge assessments with normative data that researchers can use to foster high quality data collection in web-based environments.
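The abstract reports two standard psychometric checks: internal consistency and item discrimination. As a minimal sketch of how these are conventionally computed from a binary (correct/incorrect) response matrix, the snippet below implements Cronbach's alpha and the corrected item-total correlation; this is an illustration of the general technique, not the authors' actual analysis pipeline, and the function names and toy data are our own.

```python
from statistics import variance, mean

def cronbach_alpha(responses):
    # responses: one row per participant, one column per item, scored 0/1.
    # alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals)
    k = len(responses[0])
    cols = list(zip(*responses))
    item_var_sum = sum(variance(col) for col in cols)
    totals = [sum(row) for row in responses]
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

def item_discrimination(responses):
    # Corrected item-total correlation: each item is correlated with the
    # sum of the *remaining* items, so the item does not inflate its own score.
    cols = list(zip(*responses))
    totals = [sum(row) for row in responses]
    discs = []
    for col in cols:
        rest = [t - x for t, x in zip(totals, col)]
        mx, my = mean(col), mean(rest)
        cov = sum((x - mx) * (y - my) for x, y in zip(col, rest))
        sx = sum((x - mx) ** 2 for x in col) ** 0.5
        sy = sum((y - my) ** 2 for y in rest) ** 0.5
        discs.append(cov / (sx * sy))
    return discs
```

Items with higher corrected item-total correlations separate high- and low-scoring test takers more sharply, which is the sense in which the low-frequency items in the study were the more discriminating ones.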
Source journal metrics:
CiteScore: 2.00
Self-citation rate: 18.20%
Articles per year: 105
Journal description: Linguistics Vanguard is a new channel for high quality articles and innovative approaches in all major fields of linguistics. This multimodal journal is published solely online and provides an accessible platform supporting both traditional and new kinds of publications. Linguistics Vanguard seeks to publish concise and up-to-date reports on the state of the art in linguistics as well as cutting-edge research papers. With its topical breadth of coverage and anticipated quick rate of production, it is one of the leading platforms for scientific exchange in linguistics. Its broad theoretical range, international scope, and diversity of article formats engage students and scholars alike. All topics within linguistics are welcome. The journal especially encourages submissions taking advantage of its multimodal platform designed to integrate interactive content, including audio and video, images, maps, software code, raw data, and any other media that enhances the traditional written word. The novel platform and concise article format allow for rapid turnaround of submissions. Full peer review assures quality and enables authors to receive appropriate credit for their work. The journal publishes general submissions as well as special collections. Ideas for special collections may be submitted to the editors for consideration.