{"title":"Investigating the Use and Effects of Feedback in CodingBat Exercises: An Exploratory Thinking Aloud Study","authors":"Natalie Kiesler","doi":"10.1109/IEEECONF56852.2023.10104622","DOIUrl":null,"url":null,"abstract":"The increasing availability of online tools helps support computing students via feedback to gain more practice in programming at their own pace. Due to the lack of educators’ insights into students’ independent practice with such online tools and their feedback options, this research aims at the evaluation of tutoring feedback types offered by the exemplary online tool CodingBat. In particular, students’ use of tutoring feedback types, as well as their effects on the cognitive, meta-cognitive and motivational level are investigated. The exploratory research methodology comprises a qualitative thinking aloud study with five novice learners of programming. The transcribed protocols were analyzed by using qualitative content analysis and deductive categories that originate from previous research on feedback effects and their observable and reportable indicators. The qualitative results reveal insights into students’ use of feedback, effects of feedback types on the cognitive, meta-cognitive and motivational level, as well as the importance of tutoring feedback including hints and a sample solution. The results of this applied, qualitative research add to the exploration of recommendations for the design of tutoring feedback in the context of self-paced online exercises for novice programmers. The findings further imply that automatically generated tutoring feedback seems to be helpful even without information that is adapted to the individual learner’s input.","PeriodicalId":445092,"journal":{"name":"2023 Future of Educational Innovation-Workshop Series Data in Action","volume":"273 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 Future of Educational Innovation-Workshop Series Data in Action","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IEEECONF56852.2023.10104622","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1
Abstract
The increasing availability of online tools supports computing students with feedback so that they can gain more practice in programming at their own pace. Because educators have little insight into students’ independent practice with such online tools and their feedback options, this research evaluates the tutoring feedback types offered by the exemplary online tool CodingBat. In particular, students’ use of tutoring feedback types, as well as the effects of these types on the cognitive, meta-cognitive, and motivational levels, are investigated. The exploratory research methodology comprises a qualitative thinking-aloud study with five novice learners of programming. The transcribed protocols were analyzed using qualitative content analysis with deductive categories that originate from previous research on feedback effects and their observable and reportable indicators. The qualitative results offer insights into students’ use of feedback, the effects of feedback types on the cognitive, meta-cognitive, and motivational levels, and the importance of tutoring feedback that includes hints and a sample solution. The results of this applied, qualitative research contribute to recommendations for the design of tutoring feedback in the context of self-paced online exercises for novice programmers. The findings further imply that automatically generated tutoring feedback appears to be helpful even without information adapted to the individual learner’s input.