{"title":"Creating authentic assessment in mathematics","authors":"Ria Symonds, Lisa Mott","doi":"10.21100/msor.v22i2.1484","DOIUrl":null,"url":null,"abstract":"Assessment of students’ mathematics knowledge within higher education (HE) has normally taken a very traditional approach. Closed-book assessments have long been the favoured mode of assessment (Iannone Simpson, 2011) which often requires students to recall facts, formulae, and methods. One could argue that this type of assessment is limited in its ability to effectively assess how well a student’s ability to authentically use mathematics has developed. Due to the recent pandemic, many institutions were forced to rethink their assessment methods so that they could be delivered online and remotely. As such, there has been a renewed sense of need for more ‘authentic’ assessments for mathematics-based programmes.In this paper, we will discuss our journey of creating more authentic assessments for apprentices enrolled on a new Data Science Degree Apprenticeship, particularly in mathematics/statistics. We will compare two years of delivery of the course; the first year of delivery which comprised of traditional assessment methods (coursework/exam) and the second year of delivery that used more authentic assessment methods. We will discuss the pros and cons of each model by reflecting on our practice and drawing on apprentices’ feedback.","PeriodicalId":18932,"journal":{"name":"MSOR connections","volume":"15 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-04-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"MSOR connections","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.21100/msor.v22i2.1484","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Assessment of students’ mathematics knowledge within higher education (HE) has normally taken a very traditional approach. Closed-book assessments have long been the favoured mode of assessment (Iannone & Simpson, 2011), often requiring students to recall facts, formulae, and methods. One could argue that this type of assessment is limited in its ability to effectively gauge how well students have developed the ability to use mathematics authentically. Due to the recent pandemic, many institutions were forced to rethink their assessment methods so that they could be delivered online and remotely. As such, there has been a renewed sense of need for more ‘authentic’ assessments on mathematics-based programmes.

In this paper, we will discuss our journey of creating more authentic assessments, particularly in mathematics and statistics, for apprentices enrolled on a new Data Science Degree Apprenticeship. We will compare two years of delivery of the course: the first year, which used traditional assessment methods (coursework and an exam), and the second year, which used more authentic assessment methods. We will discuss the pros and cons of each model by reflecting on our practice and drawing on apprentices’ feedback.