Studies in Language Assessment: Latest Publications

Anthony Green. Exploring Language Assessment and Testing: Language in Action
Studies in Language Assessment Pub Date : 2015-01-01 DOI: 10.58379/uzbz3766
S. Davidson
{"title":"Anthony Green. Exploring Language Assessment and Testing: Language in Action","authors":"S. Davidson","doi":"10.58379/uzbz3766","DOIUrl":"https://doi.org/10.58379/uzbz3766","url":null,"abstract":"<jats:p>n/a</jats:p>","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2015-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"80999532","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Towards improved language assessment of written health professional communication: the case of the Occupational English Test
Studies in Language Assessment Pub Date : 2015-01-01 DOI: 10.58379/mfbr6523
U. Knoch, T. McNamara, R. Woodward‐Kron, C. Elder, E. Manias, E. Flynn, Ying Zhang
{"title":"Towards improved language assessment of written health professional communication: the case of the Occupational English Test","authors":"U. Knoch, T. McNamara, R. Woodward‐Kron, C. Elder, E. Manias, E. Flynn, Ying Zhang","doi":"10.58379/mfbr6523","DOIUrl":"https://doi.org/10.58379/mfbr6523","url":null,"abstract":"<jats:p>n/a</jats:p>","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2015-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90334909","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 4
Using corpus complexity analysis to refine a holistic ESL writing placement rubric
Studies in Language Assessment Pub Date : 2015-01-01 DOI: 10.58379/evsp6047
J. R. Gevara
{"title":"Using corpus complexity analysis to refine a holistic ESL writing placement rubric","authors":"J. R. Gevara","doi":"10.58379/evsp6047","DOIUrl":"https://doi.org/10.58379/evsp6047","url":null,"abstract":"The purpose of this study is to determine if corpus analysis tools can identify linguistic features within writing placement samples that are significantly different between levels within a higher education language program. Although commercial tests are widely used for placement decisions, local performance assessments have become more common compliments that better adhere to communicative language teaching. At the university where this study was conducted, raters use a holistic rubric to score students’ responses to one academic topic. The scoring process is fast when rates agree but too time consuming when raters search for information to resolve disagreements. Writing placement samples from 123 former students’ essays at an Intensive English Program were used to compile a corpus. I divided the writing samples into four folders that correspond with the program levels and analyzed the folders using syntactic, lexical, and essay complexity analyzers. I utilized the robustness of the ANOVA to account for assumption violations. Data that violated the normality assumption were first analyzed using the Kruskal-Wallis Test. Those variables showing significant differences between levels were then analyzed using ANOVA and the appropriate post-hoc tests. Results show significant between group differences with lexical and word types and tokens, complex nominal, verb phrases, and ideas. I discuss the interpretation of these variables as well as show how administrators used this information to revise the rubric from Version I to Version II. Broader implications from this study are the use of corpus research tools to operationalize performance for the purposes of model building.","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2015-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76242920","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 2
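To make the analysis sequence in the abstract concrete, here is a minimal sketch of that kind of two-step screen: a Kruskal-Wallis test as a non-parametric first pass, followed by one-way ANOVA and Tukey HSD post-hoc comparisons for measures that show level differences. The data frame, measure names and level labels below are invented for illustration; they are not the study's corpus data.

```python
# Hypothetical screening pipeline mirroring the abstract's analysis sequence:
# Kruskal-Wallis first, then ANOVA with a post-hoc test on measures that show
# significant differences between program levels. All data are simulated.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
levels = ["Level1", "Level2", "Level3", "Level4"]
df = pd.DataFrame({
    "level": np.repeat(levels, 30),
    # Simulated complexity measures per placement essay (invented names).
    "word_types": np.concatenate([rng.normal(80 + 15 * i, 10, 30) for i in range(4)]),
    "complex_nominals": np.concatenate([rng.normal(5 + 2 * i, 1.5, 30) for i in range(4)]),
})

for measure in ["word_types", "complex_nominals"]:
    groups = [g[measure].to_numpy() for _, g in df.groupby("level")]
    _, p_kw = stats.kruskal(*groups)              # non-parametric first pass
    if p_kw < 0.05:
        _, p_anova = stats.f_oneway(*groups)      # parametric follow-up
        tukey = pairwise_tukeyhsd(df[measure], df["level"])  # post-hoc pairs
        print(f"{measure}: Kruskal-Wallis p={p_kw:.4f}, ANOVA p={p_anova:.4f}")
        print(tukey.summary())
```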
Development of a Spanish generic writing skills scale for the Colombian Graduate Skills Assessment (Saber Pro)
Studies in Language Assessment Pub Date : 2015-01-01 DOI: 10.58379/dayb9070
A. Ducasse, Kathryn Hill
{"title":"Development of a Spanish generic writing skills scale for the Colombian Graduate Skills Assessment (Saber Pro)","authors":"A. Ducasse, Kathryn Hill","doi":"10.58379/dayb9070","DOIUrl":"https://doi.org/10.58379/dayb9070","url":null,"abstract":"While many higher education institutions list the generic skills their graduates are intended to acquire during a course of study (Barrie, 2006), the relevant skills are rarely directly assessed at graduation. In Colombia, exit assessment is compulsory for all post-secondary training and education. To this end, a Spanish-language version of the Australian Graduate Skills Assessment (GSA) was developed for the Colombian context. However, problems were identified with the reliability of the Spanish version of the GSA writing scale. This paper describes the process of replacing the original version of the Spanish-language version of the GSA scale (an intuitively based writing scale) with an empirically based scale developed using a question tree method. Forty raters constructed two holistic (combined trait) and three analytic (individual trait) writing scales using benchmarked scripts from a previous test administration. The five scales were then trialled. Comparison of the scales showed the eight-level holistic scale provided the widest distribution of scores. This research provides insights into generic writing skill testing for higher education graduates in Colombia. In addition, the study uniquely provides a detailed description of the development of empirically-based analytic and holistic scales for assessing the writing of Spanish-L1 speaking graduates in Colombia.","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2015-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"82689566","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 1
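As an illustration of the scale comparison reported above (which of the five trialled scales spreads candidates most widely), the sketch below tallies how many score points each scale actually uses and how concentrated the ratings are. The scale names, level counts and simulated ratings are hypothetical, not the study's data.

```python
# Hypothetical comparison of five trialled writing scales: how many score
# points each scale uses and how concentrated ratings are at the modal score.
# Scale names, level counts and ratings are invented for illustration.
import numpy as np

rng = np.random.default_rng(4)
n_scripts = 200
scales = {
    "holistic_8_levels":  rng.integers(1, 9, n_scripts),   # scores 1-8
    "holistic_5_levels":  rng.integers(1, 6, n_scripts),   # scores 1-5
    "analytic_content":   rng.integers(1, 6, n_scripts),
    "analytic_language":  rng.integers(1, 6, n_scripts),
    "analytic_structure": rng.integers(1, 6, n_scripts),
}

for name, scores in scales.items():
    values, counts = np.unique(scores, return_counts=True)
    # Lower peak_share means candidates are spread more evenly across levels.
    peak_share = counts.max() / counts.sum()
    print(f"{name:20s} levels_used={len(values)} peak_share={peak_share:.2f}")
```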
International assessment and local contexts: A case study of an English language initiative in higher education institutes in Egypt
Studies in Language Assessment Pub Date : 2015-01-01 DOI: 10.58379/xasg6414
H. Khalifa, Nahal Khabbazbashi, Samar Abdelsalam, Mohsen Elmahdy Said
{"title":"International assessment and local contexts: A case study of an English language initiative in higher educatoin institutes in Egypt","authors":"H. Khalifa, Nahal Khabbazbashi, Samar Abdelsalam, Mohsen Elmahdy Said","doi":"10.58379/xasg6414","DOIUrl":"https://doi.org/10.58379/xasg6414","url":null,"abstract":"Within the long-term objectives of English language reform in higher education (HE) institutes across Egypt and increasing employability in the global job market, the Center for Advancement of Postgraduate Studies and Research in Cairo University (CAPSCU), Cambridge English Language Assessment and the British Council (Egypt) have implemented a multi-phase upskilling program aimed at enhancing the workplace language skills of socially disadvantaged undergraduates, developing teachers’ pedagogical knowledge and application, providing both students and teachers with a competitive edge in the job markets through internationally recognised certification and the introduction of 21st century skills such as digital-age literacy and effective communication in HE, and, lastly, integrating international standards for teaching, learning and assessment within the local context. This paper reports on a mixed methods research study aimed at evaluating the effectiveness of this initiative and its impact at the micro and macro levels. The research focused on language progression, learner autonomy, motivation towards digital learning and assessment, improvements in pedagogical knowledge and teaching practices. Standardised assessment, attitudinal and perceptions surveys, and observational data were used. Findings suggested a positive impact of the upskilling program, illustrated how international collaborations can provide the necessary skills for today’s global job market, and highlighted areas for consideration for upscaling the initiative.","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2015-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"83533737","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 2
Using an English self-assessment tool to validate an English Placement Test
Studies in Language Assessment Pub Date : 2015-01-01 DOI: 10.58379/rphi9026
Zhi Li
{"title":"Using an English self-assessment tool to validate an English Placement Test","authors":"Zhi Li","doi":"10.58379/rphi9026","DOIUrl":"https://doi.org/10.58379/rphi9026","url":null,"abstract":"This study aimed to develop and use a contextualized self-assessment of English proficiency as a tool to validate an English Placement Test (MEPT) at a large Midwestern university in the U.S. More specifically, the self-assessment tool was expected to provide evidence for the extrapolation inference within an argument-based validity framework. 217 English as a second language (ESL) students participated in this study in the 2014 spring semester and 181 of them provided valid responses to the self-assessment. The results of a Rasch model-based item analysis indicated that the self-assessment items exhibited acceptable reliabilities and good item discrimination. There were no misfitting items in the self-assessment and the Likert scale used in the self-assessment functioned well. The results from confirmatory factor analysis indicated that a hypothesized correlated four-factor model fitted the self-assessment data. However, the multitrait-multimethod analyses revealed weak to moderate correlation coefficients between participants’ self-assessment and their performances on both the MEPT and the TOEFL iBT. Possible factors contributing to this relationship were discussed. Nonetheless, given the acceptable psychometric quality and a clear factor structure of the self-assessment, this could be a promising tool in providing evidence for the extrapolation inference of the placement test score interpretation and use.","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2015-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90074762","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 8
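The correlation evidence mentioned in the abstract comes from a multitrait-multimethod comparison. The sketch below shows the general shape of such a check, correlating self-assessed and test-based scores for the same skills; the variable names, sample and scores are simulated, not the MEPT or TOEFL iBT data.

```python
# Hypothetical multitrait-multimethod check: correlate self-assessed skill
# scores with test section scores for the same traits. Data are simulated;
# the study itself used MEPT and TOEFL iBT section scores.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 181  # number of valid self-assessment responses reported in the abstract
ability = rng.normal(0, 1, n)
scores = pd.DataFrame({
    "self_listening": ability + rng.normal(0, 1.2, n),
    "self_writing":   ability + rng.normal(0, 1.2, n),
    "test_listening": ability + rng.normal(0, 0.6, n),
    "test_writing":   ability + rng.normal(0, 0.6, n),
})

# Same-trait, different-method correlations are the key validity coefficients.
mtmm = scores.corr().round(2)
print(mtmm)
print("listening validity coefficient:",
      mtmm.loc["self_listening", "test_listening"])
```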
The use of semi-scripted speech in a listening placement test for university students
Studies in Language Assessment Pub Date : 2014-01-01 DOI: 10.58379/qflf1241
M. Clark
{"title":"The use of semi-scripted speech in a listening placement test for university students","authors":"M. Clark","doi":"10.58379/qflf1241","DOIUrl":"https://doi.org/10.58379/qflf1241","url":null,"abstract":"This paper describes the feasibility of using semi-scripted spoken lectures as stimulus materials in a test of academic listening. The context for this study was the development of a revised test of academic listening designed to place enrolled university students into one of two levels of a language support course for non-native speakers. Because academic listening often involves listening to monologic speech such as lectures (Ferris & Tagg, 1996a), and because ‘authentic’ spoken language is qualitatively different to scripted speech (Biber et al., 2004), the revised test uses semi-scripted spoken mini-lectures as stimulus passages rather than relying on scripted material. Test questions were developed using only the informational elements that four model comprehenders, proficient English listeners (both native and non-native), were able to retain from a single hearing of the passages. Test data from 222 students were analysed using a Rasch methodology. Results show that this test development method did result in testable content that was appropriately targeted at the population of interest, though several aspects of the process could be improved. The paper concludes with some recommendations for using semi-scripted language in academic listening tests.","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74418277","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 23
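"Appropriately targeted" in Rasch terms means that item difficulties sit close to the abilities of the test takers. The rough sketch below illustrates that idea with a simple log-odds transformation of a simulated 0/1 response matrix; it is not the Rasch estimation used in the study, and all data are invented.

```python
# Rough illustration of test targeting: put item difficulties and person
# abilities on a common logit scale via simple log-odds, then compare means.
# This is not a full Rasch estimation; the response matrix is simulated.
import numpy as np

rng = np.random.default_rng(2)
n_persons, n_items = 222, 30        # 222 test takers, as in the abstract
true_theta = rng.normal(0, 1, (n_persons, 1))   # simulated person abilities
true_b = rng.normal(0, 1, (1, n_items))         # simulated item difficulties
p_correct = 1 / (1 + np.exp(-(true_theta - true_b)))
responses = (rng.random((n_persons, n_items)) < p_correct).astype(int)

item_facility = responses.mean(axis=0).clip(0.01, 0.99)  # proportion correct per item
person_score = responses.mean(axis=1).clip(0.01, 0.99)   # proportion correct per person
item_difficulty_logits = np.log((1 - item_facility) / item_facility)
person_ability_logits = np.log(person_score / (1 - person_score))

# A well-targeted test has item difficulties centred near person abilities.
print(f"mean item difficulty (logits): {item_difficulty_logits.mean():.2f}")
print(f"mean person ability  (logits): {person_ability_logits.mean():.2f}")
```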
Determining aspects of text difficulty for the Sign Language of the Netherlands (NGT) Functional Assessment instrument
Studies in Language Assessment Pub Date : 2014-01-01 DOI: 10.58379/qghs6327
Annieck van den Broek-Laven, E. Boers-Visker, B. V. D. Bogaerde
{"title":"Determining aspects of text difficulty for the Sign Language of the Netherlands (NGT) Functional Assessment instrument","authors":"Annieck van den Broek-Laven, E. Boers-Visker, B. V. D. Bogaerde, Hogeschool Utrecht","doi":"10.58379/qghs6327","DOIUrl":"https://doi.org/10.58379/qghs6327","url":null,"abstract":"In this paper we describe our work in progress on the development of a set of criteria to predict text difficulty in Sign Language of the Netherlands (NGT). These texts are used in a four year bachelor program, which is being brought in line with the Common European Framework of Reference for Languages (Council of Europe, 2001). Production and interaction proficiency are assessed through the NGT Functional Assessment instrument, adapted from the Sign Language Proficiency Interview (Caccamise & Samar, 2009). With this test we were able to determine that after one year of NGT-study students produce NGT at CEFR-level A2, after two years they sign at level B1, and after four years they are proficient in NGT on CEFR-level B2. As a result of that we were able to identify NGT texts that were matched to the level of students at certain stages in their studies with a CEFR-level. These texts were then analysed for sign familiarity, morpheme-sign rate, use of space and use of non-manual signals. All of these elements appear to be relevant for the determination of a good alignment between the difficulty of NGT signed texts and the targeted CEFR level, although only the morpheme-sign rate appears to be a decisive indicator.","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"86697436","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 6
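Since the morpheme-sign rate is singled out above as the decisive indicator, here is a minimal sketch of how it could be computed from a glossed signed text in which each sign token is annotated with its morpheme count. The annotation format and the example glosses are hypothetical, not the project's actual coding scheme.

```python
# Minimal sketch of a morpheme-sign rate calculation from a glossed NGT text
# in which each sign token carries a morpheme count. The annotation format
# and example glosses are hypothetical.
from dataclasses import dataclass

@dataclass
class SignToken:
    gloss: str
    morphemes: int  # number of morphemes realised in this sign token

def morpheme_sign_rate(tokens: list[SignToken]) -> float:
    """Average number of morphemes per sign token in a signed text."""
    if not tokens:
        return 0.0
    return sum(t.morphemes for t in tokens) / len(tokens)

# Hypothetical glossed fragment: a plain lexical sign, an aspect-marked verb,
# and a classifier construction combining handshape, movement and location.
text = [SignToken("INDEX-1", 1),
        SignToken("GIVE-repeatedly", 2),
        SignToken("CL:vehicle-move-uphill", 3)]
print(f"morpheme-sign rate: {morpheme_sign_rate(text):.2f}")  # 2.00
```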
The effect of assessment of peer feedback on the quantity and quality of feedback given
Studies in Language Assessment Pub Date : 2014-01-01 DOI: 10.58379/nmwl6229
Rachel Ruegg
{"title":"The effect of assessment of peer feedback on the quantity and quality of feedback given","authors":"Rachel Ruegg","doi":"10.58379/nmwl6229","DOIUrl":"https://doi.org/10.58379/nmwl6229","url":null,"abstract":"There has been a great deal of debate about the value of peer feedback in L2 writing classes. Different aspects of the way peer feedback is implemented have been found to contribute to its effectiveness. The purpose of the current study is to ascertain whether the assessment of feedback given by peers increases the quantity or quality of feedback given. The study investigated two intact classes at a Japanese university. Both groups used peer feedback on every preliminary draft for an entire year. One was assessed only on the final draft of each essay and the other on the feedback they gave to their peers in addition to the final drafts. The feedback given by students was analysed and compared between the two groups. It was found that the feedback-assessed group covered more points, wrote more comments, longer comments, more words overall, made more marks on partners’ drafts, and made more specific comments than the product-assessed group. However, no significant difference was found between the accuracy of feedback in the two groups. The results suggest that if instructors want peer readers to give more feedback and to give more specific feedback, the feedback given by students should be assessed.","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73560257","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 3
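The between-group findings above rest on comparisons of feedback quantity measures across the two classes. The sketch below shows one such comparison, testing whether the feedback-assessed group wrote more comments per draft, with invented counts and a non-parametric test standing in for whatever analyses the study actually used.

```python
# Hypothetical between-group comparison of feedback quantity: do students
# whose feedback is assessed write more comments per draft? Counts are
# simulated; the study also measured comment length, specificity and accuracy.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
feedback_assessed = rng.poisson(lam=9, size=25)   # comments per draft (invented)
product_assessed = rng.poisson(lam=6, size=25)

# Count data are often skewed, so a non-parametric test is a reasonable choice.
u, p = stats.mannwhitneyu(feedback_assessed, product_assessed,
                          alternative="greater")
print(f"median (feedback-assessed) = {np.median(feedback_assessed):.1f}")
print(f"median (product-assessed)  = {np.median(product_assessed):.1f}")
print(f"Mann-Whitney U = {u:.1f}, one-sided p = {p:.4f}")
```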
Development of a test of speaking proficiency in multiple languages
Studies in Language Assessment Pub Date : 2014-01-01 DOI: 10.58379/rtjy4508
India C. Plough
{"title":"Development of a test of speaking proficiency in multiple languages","authors":"India C. Plough","doi":"10.58379/rtjy4508","DOIUrl":"https://doi.org/10.58379/rtjy4508","url":null,"abstract":"The Residential College in the Arts and Humanities (RCAH) at Michigan State University has a foreign language proficiency graduation requirement. The RCAH has found it necessary to revise its language proficiency program and to develop a local test of language proficiency in lieu of using existing, internationally-recognised assessments of speaking proficiency. Situated within Critical Language Testing (Shohamy, 2001a, 2001b), the paper presents motivations for this decision reached after a yearlong program review. Treating the processes of teaching, learning, and assessment as interdependent, the RCAH’s new Cultures and Languages Across the Curriculum program and the new performance-based proficiency test are built on the same methodological principles. Grounded in a social interactional theory of second language acquisition and assessment, the RCAH Test employs a paired format and is intended to assess intermediate speaking proficiency in the more commonly taught and the less commonly taught languages. Initial trials have been conducted with native speakers of English, and native and non-native speakers of French, German, and Spanish. Using discourse analytic methods, preliminary analyses highlight the potential influence of sociocultural context and bring into question the importance of syntactic complexity in the conceptualisation of speaking proficiency.","PeriodicalId":29650,"journal":{"name":"Studies in Language Assessment","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2014-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76274413","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 3