{"title":"Hybrid oversampling technique for imbalanced pattern recognition: Enhancing performance with Borderline Synthetic Minority oversampling and Generative Adversarial Networks","authors":"Md Manjurul Ahsan , Shivakumar Raman , Yingtao Liu , Zahed Siddique","doi":"10.1016/j.mlwa.2025.100637","DOIUrl":null,"url":null,"abstract":"<div><div>Class imbalance problems (CIP) are one of the potential challenges in developing unbiased Machine Learning models for predictions. CIP occurs when data samples are not equally distributed between two or multiple classes. Several synthetic oversampling techniques have been introduced to balance the imbalanced data by oversampling the minor samples. One of the potential drawbacks of existing oversampling techniques is that they often fail to focus on the data samples that lie at the border point and give more attention to the extreme observations, ultimately limiting the creation of more diverse data after oversampling, and that is almost the scenario for most of the oversampling strategies. As an effect, marginalization occurs after oversampling. To address these issues, in this work, we propose a hybrid oversampling technique, named Borderline Synthetic Minority Oversampling and Generative Adversarial Network (BSGAN), by combining the strengths of Borderline-Synthetic Minority Oversampling Technique (SMOTE) and Generative Adversarial Networks (GANs). This approach aims to generate more diverse data that follow Gaussian distributions, marking a significant contribution to the field of Artificial Intelligence. We tested BSGAN on ten highly imbalanced datasets, demonstrating its application in engineering, where it outperformed existing oversampling techniques, creating a more diverse dataset that follows a normal distribution after oversampling.</div></div>","PeriodicalId":74093,"journal":{"name":"Machine learning with applications","volume":"20 ","pages":"Article 100637"},"PeriodicalIF":0.0000,"publicationDate":"2025-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Machine learning with applications","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2666827025000209","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Class imbalance problems (CIP) are among the key challenges in developing unbiased machine learning models for prediction. A CIP occurs when data samples are not evenly distributed across two or more classes. Several synthetic oversampling techniques have been introduced to balance imbalanced data by oversampling the minority class. A common drawback of existing oversampling techniques is that they often fail to focus on the data samples that lie near the class boundary and instead give more attention to extreme observations, which limits the diversity of the data generated and, as a result, leads to marginalization after oversampling. To address these issues, in this work we propose a hybrid oversampling technique, named Borderline Synthetic Minority Oversampling and Generative Adversarial Network (BSGAN), that combines the strengths of the Borderline Synthetic Minority Oversampling Technique (Borderline-SMOTE) and Generative Adversarial Networks (GANs). This approach aims to generate more diverse data that follow a Gaussian distribution, marking a significant contribution to the field of Artificial Intelligence. We evaluated BSGAN on ten highly imbalanced datasets, demonstrating its application in engineering, where it outperformed existing oversampling techniques and produced a more diverse dataset that follows a normal distribution after oversampling.
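
To illustrate the general idea of combining boundary-focused interpolation with GAN-based generation, the following is a minimal sketch of such a hybrid pipeline. It is not the authors' published BSGAN implementation; the network sizes, hyperparameters, the number of GAN samples drawn, and the simple concatenation of the two sample sources are all assumptions made for illustration, using imbalanced-learn's BorderlineSMOTE and a small PyTorch GAN.

```python
# Illustrative sketch only: Borderline-SMOTE oversampling followed by a small GAN
# trained on the real minority samples, with both synthetic sources merged at the end.
# Hyperparameters and architecture are assumptions, not the paper's BSGAN.
import numpy as np
import torch
import torch.nn as nn
from imblearn.over_sampling import BorderlineSMOTE
from sklearn.datasets import make_classification

# 1. A small, highly imbalanced binary dataset (95% / 5%).
X, y = make_classification(n_samples=2000, n_features=10, weights=[0.95, 0.05],
                           random_state=42)

# 2. Borderline-SMOTE: interpolate new minority samples near the class boundary.
X_bsmote, y_bsmote = BorderlineSMOTE(kind="borderline-1",
                                     random_state=42).fit_resample(X, y)

# 3. A small GAN trained on the real minority samples to add further diversity.
minority = torch.tensor(X[y == 1], dtype=torch.float32)
latent_dim, n_features = 16, X.shape[1]

generator = nn.Sequential(
    nn.Linear(latent_dim, 64), nn.ReLU(),
    nn.Linear(64, n_features),
)
discriminator = nn.Sequential(
    nn.Linear(n_features, 64), nn.LeakyReLU(0.2),
    nn.Linear(64, 1), nn.Sigmoid(),
)
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCELoss()

for epoch in range(500):
    # Discriminator step: distinguish real minority samples from generated ones.
    z = torch.randn(len(minority), latent_dim)
    fake = generator(z).detach()
    d_loss = (bce(discriminator(minority), torch.ones(len(minority), 1)) +
              bce(discriminator(fake), torch.zeros(len(fake), 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: produce samples the discriminator labels as real.
    z = torch.randn(len(minority), latent_dim)
    g_loss = bce(discriminator(generator(z)), torch.ones(len(minority), 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# 4. Draw GAN samples and merge them with the Borderline-SMOTE result.
with torch.no_grad():
    X_gan = generator(torch.randn(200, latent_dim)).numpy()

X_final = np.vstack([X_bsmote, X_gan])
y_final = np.concatenate([y_bsmote, np.ones(len(X_gan), dtype=int)])
print(X_final.shape, np.bincount(y_final))
```

The downstream classifier would then be trained on `X_final`/`y_final`; how the paper actually weights or filters the two synthetic sources is described in the full text, not here.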