Title: Assessing public speaking
Authors: Noel R. Jackson, A. Ward
DOI: 10.1109/ITHET.2014.7155700
Published in: 2014 Information Technology Based Higher Education and Training (ITHET)
Publication date: 2014-09-01
Citations: 5
Abstract
Assessing public speaking has become an increasingly normal part of a university or higher-education institution's role in embedding useful transferable skills in students before they enter the workplace. This activity can take a growing amount of time away from teaching, since students need a reasonable opportunity to demonstrate what they are capable of. Approaches to assessing ability during a presentation range from handwritten notes on a paper-based guide, through tick sheets, to electronic forms and spreadsheets. Because processes vary between institutions, the question of whether the assessment should be made purely by academics, purely by students assessing their peers, or by a mixture of both is usually settled by the associated assessment regulations. There is also the need to consider weightings, marks, and categories that may vary with the subject matter, all of which can confuse the marker and introduce subjectivity into the marking process. The purpose of this study is to trial the production and use of a 'common' rubric for assessing individual students' presentations during a one-year taught masters (MSc) in Engineering Management at the University of York. One of the key research outputs will be to indicate whether such an approach, and this specific rubric in particular, is effective in reducing assessment workload for staff and thereby improving the student experience in terms of timeliness of feedback. This paper describes the production and trialing of the rubric before discussing the findings and their implications.