{"title":"Disclosing own reasoning while appraising the students’ reasoning: implications for developments in formative assessment in science-engineering education","authors":"Mariana Orozco","doi":"10.1080/02602938.2023.2196008","DOIUrl":null,"url":null,"abstract":"When instructors assess students’ laboratory reports to appraise the underlying scientific reasoning, they disclose their own concerns, epistemological assumptions and beliefs about science. The analysis of such assessments (i.e. rubric-centred scores and corresponding justificatory comments) offers a wealth of insights that can be re-engaged in further improvements of the assessment tool and procedure, and in developments in formative assessment more generally. Such insights include concerns exceeding the rubric’s descriptions (about meaningfulness, exhaustiveness, implicitness, connectivity, true inquiry and relevance), while differences among assessors are exposed (regarding epistemic values, approaches to scoring and sensitivity). This contribution is part of a broader effort to promote conducive scientific thinking and deep learning among students in science and engineering education. It addresses the question: what does the assessors’ reasoning tell us about the ways in which formative assessment is conducted, and could ideally be conducted? The empirical investigation connects to existing knowledge and discusses issues of representativeness and granularity in formative assessment. The paper elaborates on the design and use of the assessment tool, and presents evidence supporting context-bound recommendations and general conclusions. It is proposed that developments in formative assessment will benefit from a reconceptualisation of assessment criteria, as the result of a co-design activity that engages with the assessors’ epistemological concerns.","PeriodicalId":437516,"journal":{"name":"Assessment & Evaluation in Higher Education","volume":"12 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Assessment & Evaluation in Higher Education","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/02602938.2023.2196008","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}