{"title":"Feedback format preferences of international post-graduate students in Australia: an exploratory mixed methods study","authors":"A. R. Sequeira, M. Bruce, M. Paull","doi":"10.1080/02602938.2023.2199955","DOIUrl":"https://doi.org/10.1080/02602938.2023.2199955","url":null,"abstract":"","PeriodicalId":437516,"journal":{"name":"Assessment & Evaluation in Higher Education","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122544927","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"What makes a good PhD thesis? Norms of science as reflected in written assessments of PhD theses","authors":"S. Kobayashi, C. Emmeche","doi":"10.1080/02602938.2023.2200917","DOIUrl":"https://doi.org/10.1080/02602938.2023.2200917","url":null,"abstract":"Abstract This study looks at assessment of PhD theses from two perspectives: criteria in use in assessment reports at a science faculty and norms of science. Fifty assessment reports were analysed inductively, resulting in thirteen categories that examiners consider when assessing a thesis. These categories were compared with norms of science as described in the sociology of science. The study shows a high congruence between the two perspectives, but also new categories worthy of further discussion and research. Relevance of the thesis research and quality by proxy (that publication is an indicator of quality) stand out very clearly in this study compared with earlier assessment research. These two categories are both relatively new categories in assessment research and indicate that the classical norms of science are changing with an increasing influence of post-academic norms in academia.","PeriodicalId":437516,"journal":{"name":"Assessment & Evaluation in Higher Education","volume":"69 5","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132738773","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Students’ online evaluation of teaching and system continuance usage intention: new directions from a multidisciplinary perspective","authors":"Hui-Chih Wang, Rachael Ehianeta, H. Doong","doi":"10.1080/02602938.2023.2199181","DOIUrl":"https://doi.org/10.1080/02602938.2023.2199181","url":null,"abstract":"Abstract Student evaluation of teaching (SET), a major tool for assessing teaching quality in higher education, is a crucial research topic. Among 13 studies published about online peer SET in Assessment & Evaluation in Higher Education over the past two decades, ease of use, clarity and helpfulness of SET information were repeatedly tested. This study introduces theory from the information systems field to give a multidisciplinary view in testing how students’ perceptions may affect their intention to continue to use an online peer SET system. While past studies focused on ratemyprofessor.com, offering results from the USA, Canada and the UK, this study aimed to provide Asian insight by using data from Taiwan. Based on 364 student members of the selected website, findings indicated that disconfirmation of SET information significantly affected perceived usefulness, trust and satisfaction, ultimately shaping continuance usage intention of the online peer SET system. Practical implications for online peer SET website managers and institutional SET managers are discussed.","PeriodicalId":437516,"journal":{"name":"Assessment & Evaluation in Higher Education","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114653995","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Student motivations, perceptions and opinions of participating in student evaluation of teaching surveys: a scoping review","authors":"Daniel Sullivan, R. Lakeman, D. Massey, Dima Nasrawi, M. Tower, Megan Lee","doi":"10.1080/02602938.2023.2199486","DOIUrl":"https://doi.org/10.1080/02602938.2023.2199486","url":null,"abstract":"","PeriodicalId":437516,"journal":{"name":"Assessment & Evaluation in Higher Education","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130456092","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Student evaluation of teaching: reactions of Australian academics to anonymous non-constructive student commentary","authors":"M. Hutchinson, R. Coutts, D. Massey, Dima Nasrawi, Jann Fielden, Megan Lee, R. Lakeman","doi":"10.1080/02602938.2023.2195598","DOIUrl":"https://doi.org/10.1080/02602938.2023.2195598","url":null,"abstract":"","PeriodicalId":437516,"journal":{"name":"Assessment & Evaluation in Higher Education","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126147936","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"New frontiers in student evaluations of teaching: university efforts to design and test a new instrument for student feedback","authors":"","doi":"10.1080/02602938.2023.2190060","DOIUrl":"https://doi.org/10.1080/02602938.2023.2190060","url":null,"abstract":"Abstract Student evaluations of teaching (SETs) are a ubiquitous feature of higher education. However, scholars have presented numerous challenges to the accuracy, validity, reliability and objectivity of SETs as a measure of teaching effectiveness. Given the potential for bias, the use of SETs in professional review may constitute a form of institutional discrimination. Therefore, institutions of higher learning need to develop, adopt and refine better methods for collecting and using student feedback. This paper describes the steps taken by a mid-sized comprehensive university in the USA over a three-year period to do that. We describe the work of our committee dealing with this issue, how we collaborated with the rest of the university to enact change, and the Learning Environment Survey (LENS) system that the university eventually selected and modified. We also report findings from a pilot study of the new instrument, which was favorably received by both students and faculty, and make recommendations for other institutions of higher education.","PeriodicalId":437516,"journal":{"name":"Assessment & Evaluation in Higher Education","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125893525","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Disclosing own reasoning while appraising the students’ reasoning: implications for developments in formative assessment in science-engineering education","authors":"Mariana Orozco","doi":"10.1080/02602938.2023.2196008","DOIUrl":"https://doi.org/10.1080/02602938.2023.2196008","url":null,"abstract":"When instructors assess students’ laboratory reports to appraise the underlying scientific reasoning, they disclose their own concerns, epistemological assumptions and beliefs about science. The analysis of such assessments (i.e. rubric-centred scores and corresponding justificatory comments) offers a wealth of insights that can be re-engaged in further improvements of the assessment tool and procedure, and in developments in formative assessment more generally. Such insights include concerns exceeding the rubric’s descriptions (about meaningfulness, exhaustiveness, implicitness, connectivity, true inquiry, relevance), while differences among assessors are exposed (regarding epistemic values, approaches to scoring, sensitivity). This contribution is part of a broader effort to promote students’ conducive scientific thinking and deep learning in science and engineering education. It addresses the question(s): what does the assessors’ reasoning tell us about the ways in which formative assessment is conducted, and could ideally be? The empirical investigation connects to existing knowledge, and discusses issues of representativeness and granularity in formative assessment. The paper elaborates on the design and use of the assessment tool, and presents evidence supporting context-bound recommendations and general conclusions. It is proposed that developments in formative assessment will benefit from reconceptualisation of assessment criteria, as the result of a co-design activity that engages with the assessors’ epistemological concerns.","PeriodicalId":437516,"journal":{"name":"Assessment & Evaluation in Higher Education","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115449037","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Feedback literacy-as-event: relationality, space and temporality in feedback encounters","authors":"K. Gravett, D. Carless","doi":"10.1080/02602938.2023.2189162","DOIUrl":"https://doi.org/10.1080/02602938.2023.2189162","url":null,"abstract":"","PeriodicalId":437516,"journal":{"name":"Assessment & Evaluation in Higher Education","volume":"114 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128135560","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Self-regulating writers’ uses and non-uses of peer feedback","authors":"Natalie Usher","doi":"10.1080/02602938.2023.2179970","DOIUrl":"https://doi.org/10.1080/02602938.2023.2179970","url":null,"abstract":"","PeriodicalId":437516,"journal":{"name":"Assessment & Evaluation in Higher Education","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129377215","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Defamiliarizing assessment and feedback: exploring the potential of ‘moments of engagement’ to throw light on the marking of undergraduate assignments","authors":"J. Tuck","doi":"10.1080/02602938.2023.2181942","DOIUrl":"https://doi.org/10.1080/02602938.2023.2181942","url":null,"abstract":"Assessors’ perspectives on their evaluative practices remain relatively under-researched. Given evidence that higher education assessment and feedback continue to be problematic, this paper proposes a specific methodological innovation with potential to contribute both to research and practice in this area. It explores the potential of a micro-analysis of textual engagement, nested within an ethnographic approach, to defamiliarize the often taken-for-granted practice of marking. The study on which the paper is based used screen capture combined with audio-recorded, concurrent talk-around-text to throw light on the processes, strategies and perspectives of eight teachers within one university as they assessed undergraduates’ work. This close-up focus was nested within broader ethnographic data generation incorporating interviews, marked assignments and other assessment-related texts. The paper presents selected ‘moments of engagement’ to show how this methodology can offer a renewed understanding of evaluative literacies as complex, ‘messy’ and shot through with influences invisible in the final assessed text but which may nevertheless be highly consequential. The paper concludes by reflecting on the potential for this type of data and analysis to contribute to assessor development and inform debate about the future of higher education assessment.","PeriodicalId":437516,"journal":{"name":"Assessment & Evaluation in Higher Education","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125703799","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}