Dewant Katare;David Solans Noguero;Souneil Park;Nicolas Kourtellis;Marijn Janssen;Aaron Yi Ding
{"title":"通过处理数据集中的类别不平衡分析和减轻弱势道路使用者的偏见","authors":"Dewant Katare;David Solans Noguero;Souneil Park;Nicolas Kourtellis;Marijn Janssen;Aaron Yi Ding","doi":"10.1109/OJITS.2025.3564558","DOIUrl":null,"url":null,"abstract":"Vulnerable road users (VRUs), including pedestrians, cyclists, and motorcyclists, account for approximately 50% of road traffic fatalities globally, as per the World Health Organization. In these scenarios, the accuracy and fairness of perception applications used in autonomous driving become critical to reduce such risks. For machine learning models, performing object classification and detection tasks, the focus has been on improving accuracy and enhancing model performance metrics; however, issues such as biases inherited in models, statistical imbalances and disparities within the datasets are often overlooked. Our research addresses these issues by exploring class imbalances among vulnerable road users by focusing on class distribution analysis, evaluating model performance, and bias impact assessment. Using popular CNN models and Vision Transformers (ViTs) with the nuScenes dataset, our performance evaluation shows detection disparities for underrepresented classes. Compared to related work, we focus on metric-specific and cost-sensitive learning for model optimization and bias mitigation, which includes data augmentation and resampling. Using the proposed mitigation approaches, we see improvement in IoU(%) and NDS(%) metrics from 71.3 to 75.6 and 80.6 to 83.7 for the CNN model. Similarly, for ViT, we observe improvement in IoU and NDS metrics from 74.9 to 79.2 and 83.8 to 87.1. This research contributes to developing reliable models while addressing inclusiveness for minority classes in datasets. Code can be accessed at: BiasDet.","PeriodicalId":100631,"journal":{"name":"IEEE Open Journal of Intelligent Transportation Systems","volume":"6 ","pages":"590-604"},"PeriodicalIF":4.6000,"publicationDate":"2025-04-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10977047","citationCount":"0","resultStr":"{\"title\":\"Analyzing and Mitigating Bias for Vulnerable Road Users by Addressing Class Imbalance in Datasets\",\"authors\":\"Dewant Katare;David Solans Noguero;Souneil Park;Nicolas Kourtellis;Marijn Janssen;Aaron Yi Ding\",\"doi\":\"10.1109/OJITS.2025.3564558\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Vulnerable road users (VRUs), including pedestrians, cyclists, and motorcyclists, account for approximately 50% of road traffic fatalities globally, as per the World Health Organization. In these scenarios, the accuracy and fairness of perception applications used in autonomous driving become critical to reduce such risks. For machine learning models, performing object classification and detection tasks, the focus has been on improving accuracy and enhancing model performance metrics; however, issues such as biases inherited in models, statistical imbalances and disparities within the datasets are often overlooked. Our research addresses these issues by exploring class imbalances among vulnerable road users by focusing on class distribution analysis, evaluating model performance, and bias impact assessment. Using popular CNN models and Vision Transformers (ViTs) with the nuScenes dataset, our performance evaluation shows detection disparities for underrepresented classes. 
Compared to related work, we focus on metric-specific and cost-sensitive learning for model optimization and bias mitigation, which includes data augmentation and resampling. Using the proposed mitigation approaches, we see improvement in IoU(%) and NDS(%) metrics from 71.3 to 75.6 and 80.6 to 83.7 for the CNN model. Similarly, for ViT, we observe improvement in IoU and NDS metrics from 74.9 to 79.2 and 83.8 to 87.1. This research contributes to developing reliable models while addressing inclusiveness for minority classes in datasets. Code can be accessed at: BiasDet.\",\"PeriodicalId\":100631,\"journal\":{\"name\":\"IEEE Open Journal of Intelligent Transportation Systems\",\"volume\":\"6 \",\"pages\":\"590-604\"},\"PeriodicalIF\":4.6000,\"publicationDate\":\"2025-04-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10977047\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Open Journal of Intelligent Transportation Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10977047/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Open Journal of Intelligent Transportation Systems","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10977047/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Analyzing and Mitigating Bias for Vulnerable Road Users by Addressing Class Imbalance in Datasets
Vulnerable road users (VRUs), including pedestrians, cyclists, and motorcyclists, account for approximately 50% of road traffic fatalities globally, according to the World Health Organization. In this context, the accuracy and fairness of the perception applications used in autonomous driving become critical to reducing such risks. For machine learning models performing object classification and detection tasks, the focus has been on improving accuracy and model performance metrics; however, issues such as biases inherited by models, statistical imbalances, and disparities within the datasets are often overlooked. Our research addresses these issues by exploring class imbalances among vulnerable road users, focusing on class distribution analysis, model performance evaluation, and bias impact assessment. Using popular CNN models and Vision Transformers (ViTs) on the nuScenes dataset, our performance evaluation reveals detection disparities for underrepresented classes. Compared to related work, we focus on metric-specific and cost-sensitive learning for model optimization and bias mitigation, combined with data augmentation and resampling. Using the proposed mitigation approaches, the IoU and NDS metrics for the CNN model improve from 71.3% to 75.6% and from 80.6% to 83.7%, respectively. Similarly, for the ViT, the IoU and NDS metrics improve from 74.9% to 79.2% and from 83.8% to 87.1%. This research contributes to developing reliable models while addressing inclusiveness for minority classes in datasets. Code can be accessed at: BiasDet.
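As a rough illustration of the cost-sensitive learning and resampling strategies the abstract refers to, the sketch below shows a generic PyTorch setup that weights the loss inversely to class frequency and oversamples minority classes during training. The class names, sample counts, and dataset handle are hypothetical placeholders and are not taken from the authors' BiasDet code; the paper's actual implementation may differ.

# Illustrative sketch only: generic cost-sensitive loss and class-balanced
# resampling in PyTorch. Class counts/names below are hypothetical, not from BiasDet.
import torch
import torch.nn as nn
from torch.utils.data import WeightedRandomSampler  # DataLoader would also be imported in practice

# Hypothetical per-class sample counts (e.g., from a nuScenes-style label scan,
# where cars heavily outnumber cyclists and motorcyclists).
class_names = ["car", "pedestrian", "cyclist", "motorcycle"]
class_counts = torch.tensor([480000.0, 210000.0, 12000.0, 9000.0])

# Cost-sensitive learning: weight each class inversely to its frequency so that
# misclassifying a rare VRU class incurs a larger loss than a common class.
class_weights = class_counts.sum() / (len(class_counts) * class_counts)
criterion = nn.CrossEntropyLoss(weight=class_weights)

# Resampling: draw minority-class samples more often during training.
# `labels` stands in for the per-sample class indices of the training set.
labels = torch.randint(0, len(class_counts), (1000,))  # placeholder labels
sample_weights = class_weights[labels]
sampler = WeightedRandomSampler(sample_weights, num_samples=len(labels), replacement=True)

# The sampler replaces shuffle=True; `train_dataset` is assumed to exist:
# loader = DataLoader(train_dataset, batch_size=32, sampler=sampler)

In this kind of setup, the weighted loss and the sampler attack the same imbalance from two directions: the sampler changes how often rare classes are seen, while the weighted loss changes how much each mistake on them costs.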