Batch gradient neuro-fuzzy learning method with smoothing L0 regularization for the first-order Takagi-Sugeno system

Qingqing Ma, Chunmei Qi, Huisheng Zhang
DOI: 10.1145/3514105.3514108
Published in: Proceedings of the 2022 9th International Conference on Wireless Communication and Sensor Networks
Publication date: 2022-01-11
Citations: 0

Abstract

In this paper, we propose a batch gradient neuro-fuzzy learning algorithm with smoothing L0 regularization (BGNFSL0) for the first-order Takagi-Sugeno system. L0 regularization tends to produce the sparsest solution; however, solving it exactly is NP-hard, so it cannot be used directly to design a regularized gradient neuro-fuzzy learning method. By exploiting a series of smoothing functions to approximate the L0 regularizer, the proposed BGNFSL0 avoids the NP-hard nature of the original L0 regularization while inheriting its advantage of producing the sparsest solution. In this way, BGNFSL0 can prune the network efficiently during learning and thus improve its generalization capability. Simulations comparing it with several other popular regularization learning methods show that BGNFSL0 performs best both in producing a parsimonious network and in generalization capability.
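The abstract does not specify the smoothing functions used, but the core idea it describes—replacing the non-differentiable L0 "norm" with a smooth surrogate so that an ordinary batch gradient update can be applied—can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the surrogate `1 - exp(-w^2/sigma^2)` is one common choice from the smoothed-L0 literature, and a plain linear model stands in for the consequent parameters of a first-order Takagi-Sugeno system.

```python
import numpy as np

def smoothed_l0(w, sigma):
    """Differentiable surrogate for ||w||_0: sum(1 - exp(-w^2/sigma^2)).
    Approaches the exact L0 count of nonzero entries as sigma -> 0."""
    return np.sum(1.0 - np.exp(-w**2 / sigma**2))

def smoothed_l0_grad(w, sigma):
    """Gradient of the surrogate w.r.t. w (elementwise)."""
    return (2.0 * w / sigma**2) * np.exp(-w**2 / sigma**2)

def batch_gradient_train(X, y, lam=0.01, sigma=1.0, lr=0.05, epochs=500):
    """Batch gradient descent on MSE + lam * smoothed-L0 penalty
    for a linear model y ~ X @ w. The penalty's gradient vanishes for
    large weights but pulls small weights toward zero, which is what
    enables pruning during learning."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        err = X @ w - y
        grad = (X.T @ err) / n + lam * smoothed_l0_grad(w, sigma)
        w -= lr * grad
    return w

# Toy data: only features 0 and 2 actually contribute to y.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2]
w = batch_gradient_train(X, y)
```

After training, the two informative weights converge near their true values while the three irrelevant weights are driven toward zero, so a small threshold can prune them from the model.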