Batch gradient neuro-fuzzy learning method with smoothing L0 regularization for the first-order Takagi-Sugeno system

Authors: Qingqing Ma, Chunmei Qi, Huisheng Zhang
DOI: 10.1145/3514105.3514108 (https://doi.org/10.1145/3514105.3514108)
Published in: Proceedings of the 2022 9th International Conference on Wireless Communication and Sensor Networks, 2022-01-11
Citations: 0
Abstract
In this paper, we propose a batch gradient neuro-fuzzy learning algorithm with smoothing L0 regularization (BGNFSL0) for the first-order Takagi-Sugeno system. The L0 regularization method tends to produce the sparsest solution; however, solving it is NP-hard, so it cannot be used directly in designing a regularized gradient neuro-fuzzy learning method. By exploiting a series of smoothing functions to approximate the L0 regularizer, the proposed BGNFSL0 avoids the NP-hard nature of the original L0 regularization while inheriting its advantage of producing the sparsest solution. In this way, BGNFSL0 can prune the network efficiently during the learning procedure and thus improve the generalization capability of the network. Simulations comparing BGNFSL0 with several other popular regularization learning methods show that it performs best both in generating a parsimonious network and in generalization capability.