Authors: Tao Gao, Xiao Bai, Liang Zhang, Jian Wang
Published in: 2021 IEEE Symposium Series on Computational Intelligence (SSCI), December 5, 2021
DOI: 10.1109/SSCI50451.2021.9659548
Feature Selection for Fuzzy Neural Networks using Group Lasso Regularization
In this paper, a Group Lasso penalty based embedded feature selection method for a multiple-input and multiple-output (MIMO) Takagi-Sugeno (TS) fuzzy neural network (FNN) is proposed. Group Lasso regularization induces sparsity in the widths of the modified Gaussian membership functions, and this sparsity guides the selection of useful features. Compared with Lasso, the Group Lasso formulation applies a group penalty to the entire set of widths (weights) connected to a particular feature, so a feature is retained or discarded as a whole. To address the non-differentiability of the Group Lasso term at zero, a smoothing Group Lasso method is introduced. Finally, one benchmark classification problem and two regression problems are used to validate the effectiveness of the proposed method.
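The abstract's core idea can be illustrated with a short sketch. The following is not the authors' implementation; it is a minimal example, assuming each input feature contributes a group of membership-function widths, of the standard Group Lasso penalty (sum of per-group L2 norms) and a common smoothed variant that replaces each norm with a differentiable approximation so gradient-based training can proceed near zero.

```python
import numpy as np

def group_lasso_penalty(widths, groups):
    """Sum of L2 norms over groups of widths.

    widths: 1-D array of all membership-function widths.
    groups: list of index lists, one per input feature; the whole
            group shrinks to zero together, removing that feature.
    """
    return sum(np.linalg.norm(widths[g]) for g in groups)

def smoothed_group_lasso_penalty(widths, groups, eps=1e-3):
    """Smooth approximation sqrt(||w_g||^2 + eps^2) of each group norm.

    The plain group norm is non-differentiable when a group is exactly
    zero; adding eps^2 under the square root makes the penalty smooth
    everywhere, at the cost of a small bias of order eps per group.
    """
    return sum(np.sqrt(np.sum(widths[g] ** 2) + eps ** 2) for g in groups)
```

As eps shrinks toward zero the smoothed penalty approaches the exact Group Lasso penalty from above, so the approximation error per group is at most eps. The specific smoothing used in the paper may differ; this form is one widely used choice.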