Real-world data often exhibit asymmetric class distributions, where certain target values have significantly fewer observations than others. Such non-uniform distribution across categories can substantially degrade model performance in classification problems. This research introduces the performance-based active learning (PbAL) scheme to address the class imbalance problem while accommodating nonlinear decision boundaries. PbAL is designed to sequentially select the most beneficial samples from an imbalanced data set by directly evaluating a performance metric on a pool of data. While parametric logistic regression offers a fundamental classification model with ease of interpretation, its assumption of a linear relationship in the logit function is often questionable. The use of nonparametric logistic regression with smoothing splines allows for a more flexible classification boundary. Experiments with several data sets demonstrate that PbAL often outperforms traditional active learning approaches based on D-optimality and A-optimality. Additionally, the proposed method yields superior results compared to other resampling techniques commonly used for imbalanced classification problems, even with a smaller sample size. These findings suggest that PbAL effectively mitigates the bias caused by training on imbalanced classes, which can severely impair a model's ability to accurately predict class labels for new observations.
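To make the pool-based selection idea concrete, the following is a minimal sketch of a performance-based acquisition loop: at each step, the candidate from the unlabeled pool whose addition most improves a validation performance metric (F1 here) is queried. All names, the synthetic imbalanced data, and the use of plain `sklearn` logistic regression (standing in for the paper's smoothing-spline model) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

# Synthetic imbalanced data with a nonlinear boundary (illustrative assumption).
n = 300
X = rng.normal(size=(n, 2))
y = (X[:, 0] + X[:, 1] ** 2 > 1.8).astype(int)  # minority class where True

# Seed set with both classes, an unlabeled pool, and a held-out validation set.
pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]
labeled = list(neg[:10]) + list(pos[:5])
rest = [i for i in range(n) if i not in set(labeled)]
pool, val = rest[:150], np.array(rest[150:])

def pbal_step(labeled, pool):
    """Return the pool index whose addition maximizes validation F1."""
    best_i, best_f1 = None, -1.0
    for i in pool:
        trial = labeled + [i]
        clf = LogisticRegression().fit(X[trial], y[trial])
        f1 = f1_score(y[val], clf.predict(X[val]), zero_division=0)
        if f1 > best_f1:
            best_i, best_f1 = i, f1
    return best_i

# Acquire 10 labels, one at a time, by direct metric evaluation on the pool.
for _ in range(10):
    i = pbal_step(labeled, pool)
    labeled.append(i)
    pool.remove(i)
```

Because the acquisition criterion is the target metric itself rather than a design-based surrogate such as D- or A-optimality, the loop naturally favors points that repair minority-class errors; the trade-off is one model fit per pool candidate per iteration.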