Convergence Analysis of Online Gradient Method for High-Order Neural Networks and Their Sparse Optimization
Qinwei Fan, Qian Kang, Jacek M Zurada, Tingwen Huang, Dongpo Xu
IEEE Transactions on Neural Networks and Learning Systems
DOI: 10.1109/TNNLS.2023.3319989 (https://doi.org/10.1109/TNNLS.2023.3319989)
Published: 2023-10-17
Abstract
In this article, we investigate the boundedness and convergence of the online gradient method with smoothing group L1/2 regularization for the sigma-pi-sigma neural network (SPSNN). This regularization enhances the sparseness of the network and improves its generalization ability. With the original group L1/2 regularization, the error function is nonconvex and nonsmooth, which can cause the error function to oscillate. To ameliorate this drawback, we propose a simple and effective smoothing technique that eliminates this deficiency of the original group L1/2 regularization. The group L1/2 regularization optimizes the network structure in two respects: redundant hidden nodes tend to zero, and redundant weights of the surviving hidden nodes tend to zero. This article establishes strong and weak convergence results for the proposed method and proves the boundedness of the weights. Experimental results clearly demonstrate the capability of the proposed method and the effectiveness of its redundancy control, and the simulation results support the theoretical results.
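To make the idea concrete, the sketch below illustrates an online gradient update with a smoothed group L1/2 penalty. It is a minimal illustration, not the paper's method: the piecewise-polynomial smoothing function, the group norm, and the names `smooth_norm`, `group_l12_grad`, `lr`, `lam`, and `a` are all assumptions chosen for readability; the paper's exact smoothing function and update rule may differ.

```python
# Hypothetical sketch of an online gradient step with a smoothed group L1/2 penalty.
# The smoothing function is one common piecewise-polynomial choice (an assumption),
# not necessarily the one used in the paper.
import numpy as np

def smooth_norm(z, a=1e-2):
    """Smooth approximation of |z| that is differentiable at zero."""
    z = abs(z)
    if z >= a:
        return z
    # Quartic polynomial matching |z| and its derivative at z = a.
    return -z**4 / (8 * a**3) + 3 * z**2 / (4 * a) + 3 * a / 8

def smooth_norm_grad(z, a=1e-2):
    """Derivative of smooth_norm with respect to z (for z >= 0)."""
    if z >= a:
        return 1.0
    return -z**3 / (2 * a**3) + 3 * z / (2 * a)

def group_l12_penalty(groups, a=1e-2):
    """Smoothed group L1/2 penalty: sum over groups of sqrt(f(||w_g||_2))."""
    return sum(np.sqrt(smooth_norm(np.linalg.norm(w), a)) for w in groups)

def group_l12_grad(w_g, a=1e-2):
    """Gradient of sqrt(f(||w_g||_2)) with respect to one weight group w_g."""
    z = np.linalg.norm(w_g)
    if z == 0.0:
        return np.zeros_like(w_g)
    f = smooth_norm(z, a)
    # Chain rule: d/dw sqrt(f(z)) = f'(z) / (2 sqrt(f(z))) * (w / z).
    return smooth_norm_grad(z, a) / (2.0 * np.sqrt(f)) * (w_g / z)

def online_step(groups, data_grads, lr=0.01, lam=1e-3, a=1e-2):
    """One online (per-sample) update: data-term gradient plus penalty gradient."""
    return [w - lr * (g + lam * group_l12_grad(w, a))
            for w, g in zip(groups, data_grads)]
```

Because the penalty is applied per weight group (e.g., all outgoing weights of one hidden node), driving a whole group toward zero prunes that node, which is how the regularizer controls redundancy.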
Journal Introduction
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.