Debugging for Inclusivity in Online CS Courseware: Does it Work?
Amreeta Chatterjee, Rudrajit Choudhuri, Mrinmoy Sarkar, Soumiki Chattopadhyay, Dylan Liu, Samarendra Hedaoo, Margaret Burnett, Anita Sarma
International Computing Education Research Workshop, vol. 12, no. 5, pp. 419–433
Published: 2024-08-13
DOI: 10.1145/3632620.3671117 (https://doi.org/10.1145/3632620.3671117)
Citations: 0
Abstract
Online computer science (CS) courses have broadened access to CS education, yet inclusivity barriers persist for minoritized groups in these courses. One problem recent research has shown is that inclusivity biases ("inclusivity bugs") often lurk within the course materials themselves, disproportionately disadvantaging minoritized students. To address this issue, we investigated how a faculty member can use AID, an Automated Inclusivity Detector tool, to remove such inclusivity bugs from a large online CS1 (Intro CS) course, and what impact the resulting inclusivity fixes have on students' experiences. To enable this evaluation, we first needed to: (Bugs) investigate the inclusivity challenges students face in 5 online CS courses; (Build) build decision rules that capture these challenges in courseware ("inclusivity bugs") and implement them in the AID tool; (Faculty) investigate how the faculty member followed up on the inclusivity bugs that AID reported; and (Students) investigate how the faculty member's changes affected students' experiences via a before-vs-after qualitative study with CS students. Our results from (Bugs) revealed 39 inclusivity challenges spanning courseware components from the syllabus to assignments. After we implemented the rules in the tool (Build), our results from (Faculty) revealed that the faculty member treated AID more as a "peer" than as an authority when deciding whether and how to fix the bugs. Finally, the study results with (Students) revealed that students found the after-fix courseware more approachable: they felt less overwhelmed and more in control, in contrast to the before-fix version, where they constantly felt overwhelmed and often sought external assistance to understand course content.