{"title":"Intuitionistic fuzzy broad learning system with a new non-membership function","authors":"Mengying Jiang, Huisheng Zhang, Yuxuan Liu","doi":"10.1007/s00521-024-10328-6","DOIUrl":null,"url":null,"abstract":"<p>Data containing noises, outliers, and imbalanced class distributions pose challenges to the traditional classifiers. By incorporating both the membership and non-membership functions, the intuitionistic fuzzy (IF) set has shown potential in designing robust learning algorithms for classifiers. However, the non-membership function used in these IF-based classifiers usually only utilizes the local distribution information of the training samples, and the classifiers are built upon single-hidden layer networks, which degrade the performance of the corresponding classifiers. Broad learning system (BLS) is an emerging neural network model with fast learning speed and flexible network architecture; however, it still fails to distinguish noisy samples. To this end, in this paper, we propose a new definition of the non-membership function within intuitionistic fuzzy sets and subsequently propose an intuitionistic fuzzy broad learning system (IFBLS) model. The proposed non-membership function incorporates two ratio numbers based on four distances, allowing for the utilization of global information on the distribution of samples and mitigating misclassification of valid samples as noise which is often observed in traditional methods. By using a score function that considers both the membership and non-membership functions to redistribute the importance of the training samples, the proposed IFBLS benefits from both the powerful representation capability of the original BLS and the robust learning of IF-based models. 
Extensive experiments conducted on 21 imbalanced binary classification problems sourced from the UCI and KEEL repositories illustrate that the proposed IFBLS achieves state-of-the-art performance by attaining the highest testing accuracy in 17 out of the 21 problems.</p>","PeriodicalId":18925,"journal":{"name":"Neural Computing and Applications","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-08-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Computing and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s00521-024-10328-6","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Data containing noise, outliers, and imbalanced class distributions pose challenges to traditional classifiers. By incorporating both membership and non-membership functions, the intuitionistic fuzzy (IF) set has shown potential for designing robust learning algorithms for classifiers. However, the non-membership function used in existing IF-based classifiers usually exploits only the local distribution information of the training samples, and these classifiers are built upon single-hidden-layer networks, both of which degrade their performance. The broad learning system (BLS) is an emerging neural network model with fast learning speed and a flexible network architecture; however, it still fails to distinguish noisy samples from valid ones. To this end, this paper proposes a new definition of the non-membership function within intuitionistic fuzzy sets and, on that basis, an intuitionistic fuzzy broad learning system (IFBLS) model. The proposed non-membership function incorporates two ratios computed from four distances, which exploits global information about the distribution of samples and mitigates the misclassification of valid samples as noise that is often observed in traditional methods. By using a score function that combines the membership and non-membership values to redistribute the importance of the training samples, the proposed IFBLS benefits from both the powerful representation capability of the original BLS and the robustness of IF-based models. Extensive experiments on 21 imbalanced binary classification problems sourced from the UCI and KEEL repositories show that the proposed IFBLS achieves state-of-the-art performance, attaining the highest testing accuracy on 17 of the 21 problems.
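The weighting pipeline the abstract describes — a membership value, a non-membership value, and a score function that reweights each training sample before the output-layer least-squares solve — can be sketched as follows. The abstract does not give the paper's actual formulas (in particular the two-ratio, four-distance non-membership), so this sketch substitutes a classic IF-style weighting (class-centre membership, neighbour-ratio non-membership) purely for illustration; all function names and parameters here are hypothetical, not the authors'.

```python
import numpy as np

def if_sample_scores(X, y, k=5, eps=1e-8):
    """Illustrative intuitionistic-fuzzy scoring for binary labels in {-1, +1}.

    mu: closeness of each sample to its own class centre (membership).
    nu: scaled fraction of opposite-class points among the k nearest
        neighbours (a stand-in non-membership; NOT the paper's definition).
    """
    n = len(X)
    mu = np.empty(n)
    for c in (-1, 1):
        idx = np.where(y == c)[0]
        centre = X[idx].mean(axis=0)
        d = np.linalg.norm(X[idx] - centre, axis=1)
        mu[idx] = 1.0 - d / (d.max() + eps)          # membership in [0, 1]

    # Non-membership from local class mixing around each sample.
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    nu = np.empty(n)
    for i in range(n):
        nn = np.argsort(dist[i])[1:k + 1]            # skip the sample itself
        rho = np.mean(y[nn] != y[i])                 # opposite-class fraction
        nu[i] = (1.0 - mu[i]) * rho                  # guarantees mu + nu <= 1

    # Score function fusing mu and nu into one per-sample weight:
    # likely noise (mu <= nu) gets weight 0; otherwise a ratio in (0, 1].
    s = np.where(nu == 0.0, mu,
                 np.where(mu <= nu, 0.0,
                          (1.0 - nu) / (2.0 - mu - nu)))
    return mu, nu, s

def weighted_ridge(H, Y, s, lam=1e-3):
    """Score-weighted ridge solution for an output layer: each sample's
    contribution to the normal equations is rescaled by its score."""
    D = np.diag(s)
    A = H.T @ D @ H + lam * np.eye(H.shape[1])
    return np.linalg.solve(A, H.T @ D @ Y)
```

In a BLS-style model, `H` would be the concatenated feature- and enhancement-node outputs rather than the raw inputs used here; the point of the sketch is only how a score vector reshapes the output-weight solve so that low-score (likely noisy) samples barely influence the fit.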