{"title":"Assuring online assessment quality: the case of unproctored online assessment","authors":"L. Lin, D. Foung, Julia Chen","doi":"10.1108/qae-02-2022-0048","DOIUrl":null,"url":null,"abstract":"\nPurpose\nThis study aims to examine the impact of the transformation of an assessment on students’ performance and perspectives in an English for Academic Purposes course in Hong Kong. The assessment was changed from the traditional pen-and-paper mode to an unproctored online mode.\n\n\nDesign/methodology/approach\nUsing mixed methods, the research team analysed the differences between the assessment performances of those who took the course before the pandemic (n = 664) and those who took it during the pandemic (n = 702). Furthermore, focus group interviews were conducted with seven students regarding their perspectives on the unproctored assessment.\n\n\nFindings\nThe results revealed that, although there were no major differences in the overall grades of the two groups, students who were assessed online during the pandemic performed significantly better in terms of their English use. Nevertheless, the shift to online assessment had several negative effects on the students.\n\n\nOriginality/value\nPrevious studies on unproctored online assessments (UOA) were concerned with potential learning quality issues, such as plagiarism and grade inflation. This study, however, provided empirical evidence that high-quality assessment delivery can be provided via UOA if the question types and assessment arrangements are carefully decided.\n","PeriodicalId":46734,"journal":{"name":"QUALITY ASSURANCE IN EDUCATION","volume":null,"pages":null},"PeriodicalIF":1.5000,"publicationDate":"2022-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"QUALITY ASSURANCE IN EDUCATION","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1108/qae-02-2022-0048","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
引用次数: 1
Abstract
Purpose
This study aims to examine how the transformation of an assessment from the traditional pen-and-paper mode to an unproctored online mode affected students’ performance and perspectives in an English for Academic Purposes course in Hong Kong.
Design/methodology/approach
Using mixed methods, the research team analysed the differences between the assessment performances of those who took the course before the pandemic (n = 664) and those who took it during the pandemic (n = 702). Furthermore, focus group interviews were conducted with seven students regarding their perspectives on the unproctored assessment.
Findings
The results revealed that, although there were no major differences in the overall grades of the two groups, students who were assessed online during the pandemic performed significantly better in terms of their English use. Nevertheless, the shift to online assessment had several negative effects on the students.
Originality/value
Previous studies on unproctored online assessments (UOA) were concerned with potential learning quality issues, such as plagiarism and grade inflation. This study, however, provided empirical evidence that high-quality assessments can be delivered via UOA when the question types and assessment arrangements are carefully designed.
Journal description:
QAE publishes original empirical or theoretical articles on Quality Assurance issues, including dimensions and indicators of Quality and Quality Improvement, as applicable to education at all levels, including pre-primary, primary, secondary, higher and professional education. Periodically, QAE also publishes systematic reviews, research syntheses and assessment policy articles on topics of current significance. As an international journal, QAE seeks submissions on topics that have global relevance. Article submissions could pertain to the following areas integral to QAE's mission:
- organizational or program development, change and improvement
- educational testing or assessment programs
- evaluation of educational innovations, programs and projects
- school efficiency assessments
- standards, reforms, accountability, accreditation, and audits in education
- tools, criteria and methods for examining or assuring quality