{"title":"Assessing the Impact of Feedback on Student Learning Using e2Logos: A Novel Grading Tool for Online Student Reports","authors":"P. Apostolellis, L. Wheeler, Lynn Mandeltort","doi":"10.1145/3587103.3594183","DOIUrl":null,"url":null,"abstract":"A common instructional approach to many CS and engineering classes involves designing a new software system, by providing real-world, open-ended, client-driven, team-based problems, most known as Model-Eliciting Activities (MEAs). A significant challenge imposed by this approach comes from accurately and consistently assessing student work where more than one solution can be correct. Therefore, timely feedback is pivotal for student success. Such feedback is fundamental in supporting grading consistency and efficiency for graders, but importantly to scaffold student understanding for student teams working on complex, ill-defined, real-world problems. This poster presents the next step in a two-phase evaluation (the first being a usability test) of a new grading and annotation tool for online technical reports, called e2Logos (evaluating electronic logos). We propose a plan for evaluating the educational impact of e2Logos in the context of an upper-level CS elective course on Human-Computer Interaction (HCI). The poster will also include a brief presentation of e2Logos, which aims to fill a gap in assessing and grading the rich type of student work submitted in project-based learning (PBL) courses employing MEAs.","PeriodicalId":366365,"journal":{"name":"Proceedings of the 2023 Conference on Innovation and Technology in Computer Science Education V. 2","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2023 Conference on Innovation and Technology in Computer Science Education V. 2","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3587103.3594183","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
A common instructional approach in many CS and engineering classes asks student teams to design a new software system in response to real-world, open-ended, client-driven problems, commonly known as Model-Eliciting Activities (MEAs). A significant challenge of this approach lies in accurately and consistently assessing student work when more than one solution can be correct, which makes timely feedback pivotal for student success. Such feedback supports grading consistency and efficiency for graders and, importantly, scaffolds understanding for student teams working on complex, ill-defined, real-world problems. This poster presents the next step in a two-phase evaluation (the first phase being a usability test) of a new grading and annotation tool for online technical reports, called e2Logos (evaluating electronic logos). We propose a plan for evaluating the educational impact of e2Logos in the context of an upper-level CS elective course on Human-Computer Interaction (HCI). The poster will also include a brief presentation of e2Logos, which aims to fill a gap in assessing and grading the rich types of student work submitted in project-based learning (PBL) courses employing MEAs.