{"title":"Eliciting Course Feedback through a Bug Bounty Program","authors":"A. Kapoor, A. Penton, H. Pierpont","doi":"10.1145/3502717.3532159","DOIUrl":null,"url":null,"abstract":"In this paper, we present a bug bounty program that can aid instructors in systematically gathering formative feedback for iterative refinement of their course content. We describe the logistics of implementing the program, explain the types of content in which bugs are reported, elaborate on how students received the program, and evaluate if the program can be effective in improving the course quality. We present data from a large undergraduate Data Structures and Algorithms (DSA) course that was offered consecutively for four semesters. In total, 898 students enrolled in our course, 200 students reported at least one bug, and 373 bugs were reported in total related to incorrect or ambiguous content in instructional material, logistical errors such as broken links, and bugs in short programming problems such as less exhaustive testing. We found that a majority of the students who participated reported a single bug. We also found that the normalized number of bugs reported per student gradually decreased across semesters to almost one-half after two iterations (0.53 bugs reported/student in the first two semesters vs 0.28 bugs reported/student in the last two). This suggests that the program can be effectively used to iteratively refine the course content and improve the learner experience. Students received the program enthusiastically with 97% showing positive or neutral valence on the continuation of the program in future course offerings.","PeriodicalId":274484,"journal":{"name":"Proceedings of the 27th ACM Conference on on Innovation and Technology in Computer Science Education Vol. 2","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 27th ACM Conference on on Innovation and Technology in Computer Science Education Vol. 2","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3502717.3532159","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
In this paper, we present a bug bounty program that can aid instructors in systematically gathering formative feedback for iterative refinement of their course content. We describe the logistics of implementing the program, explain the types of content in which bugs were reported, elaborate on how students received the program, and evaluate whether the program is effective in improving course quality. We present data from a large undergraduate Data Structures and Algorithms (DSA) course offered over four consecutive semesters. In total, 898 students enrolled in the course, 200 of whom reported at least one bug, for 373 bugs overall. The reports concerned incorrect or ambiguous content in instructional material, logistical errors such as broken links, and defects in short programming problems, such as insufficiently exhaustive test cases. We found that a majority of the students who participated reported a single bug. We also found that the normalized number of bugs reported per student gradually decreased across semesters, falling to roughly half after two iterations (0.53 bugs reported per student in the first two semesters vs. 0.28 in the last two). This suggests that the program can be used effectively to iteratively refine course content and improve the learner experience. Students received the program enthusiastically, with 97% expressing positive or neutral sentiment toward continuing it in future course offerings.
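As a minimal sketch of the normalization behind these rates: the per-period splits below are hypothetical, chosen only to be consistent with the abstract's aggregates (898 students, 373 bugs, and the reported 0.53 and 0.28 rates); the paper does not give the semester-level breakdown.

    # Normalized bug-report rate: raw bug counts divided by enrollment,
    # so periods with different class sizes can be compared directly.
    def bugs_per_student(bugs_reported, students_enrolled):
        return bugs_reported / students_enrolled

    # Hypothetical per-period split (not from the paper), consistent with
    # the reported totals: 486 + 412 = 898 students, 258 + 115 = 373 bugs.
    early = bugs_per_student(258, 486)  # ~0.53 bugs/student (semesters 1-2)
    late = bugs_per_student(115, 412)   # ~0.28 bugs/student (semesters 3-4)
    print(f"{early:.2f} -> {late:.2f}")  # prints: 0.53 -> 0.28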