{"title":"基于正则化共轭梯度的稀疏自适应算法","authors":"R. Das","doi":"10.1109/SPCOM50965.2020.9179548","DOIUrl":null,"url":null,"abstract":"Adaptive algorithms in general yield slow convergence rate while identifying systems with colored input. In this context, the Adaptive Conjugate Gradient (ACG) algorithm shows fast convergence for colored input. However, the ACG algorithm do not exploit system sparsity. In this paper, the conjugate gradient based sparse adaptive algorithms are proposed. In particular, $\\ell_{1}$ and $\\ell_{0}$ norm penalties are added to the cost function of the ACG algorithm in order to attract the inactive taps to their optimum (i.e., zero) levels, and the resulting algorithms yield better steady-state performance. Simulation results show that the proposed algorithm outperforms recently proposed $\\ell_{0-}$ Recursive Least Square $(\\ell_{0^{-}}$RLS) algorithm.","PeriodicalId":208527,"journal":{"name":"2020 International Conference on Signal Processing and Communications (SPCOM)","volume":"33 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"ℓ1/ℓ0 Regularized Conjugate Gradient Based Sparse Adaptive Algorithms\",\"authors\":\"R. Das\",\"doi\":\"10.1109/SPCOM50965.2020.9179548\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Adaptive algorithms in general yield slow convergence rate while identifying systems with colored input. In this context, the Adaptive Conjugate Gradient (ACG) algorithm shows fast convergence for colored input. However, the ACG algorithm do not exploit system sparsity. In this paper, the conjugate gradient based sparse adaptive algorithms are proposed. In particular, $\\\\ell_{1}$ and $\\\\ell_{0}$ norm penalties are added to the cost function of the ACG algorithm in order to attract the inactive taps to their optimum (i.e., zero) levels, and the resulting algorithms yield better steady-state performance. Simulation results show that the proposed algorithm outperforms recently proposed $\\\\ell_{0-}$ Recursive Least Square $(\\\\ell_{0^{-}}$RLS) algorithm.\",\"PeriodicalId\":208527,\"journal\":{\"name\":\"2020 International Conference on Signal Processing and Communications (SPCOM)\",\"volume\":\"33 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 International Conference on Signal Processing and Communications (SPCOM)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SPCOM50965.2020.9179548\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 International Conference on Signal Processing and Communications (SPCOM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SPCOM50965.2020.9179548","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
ℓ1/ℓ0 Regularized Conjugate Gradient Based Sparse Adaptive Algorithms
Adaptive algorithms in general yield slow convergence when identifying systems with colored inputs. In this context, the Adaptive Conjugate Gradient (ACG) algorithm shows fast convergence for colored inputs. However, the ACG algorithm does not exploit system sparsity. In this paper, conjugate gradient based sparse adaptive algorithms are proposed. In particular, $\ell_{1}$ and $\ell_{0}$ norm penalties are added to the cost function of the ACG algorithm in order to attract the inactive taps to their optimum (i.e., zero) levels, and the resulting algorithms yield better steady-state performance. Simulation results show that the proposed algorithms outperform the recently proposed $\ell_{0}$-Recursive Least Squares ($\ell_{0}$-RLS) algorithm.
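The abstract does not reproduce the exact cost function, so the following is only a hedged sketch of how such a penalty is typically attached in zero-attracting sparse adaptive filtering; the regularization weight $\gamma$, the smoothing parameter $\beta$, and the exponentially weighted correlation quantities $\mathbf{R}_n$ and $\mathbf{p}_n$ are illustrative assumptions, not taken from the paper. The ACG algorithm minimizes a quadratic least-squares cost, and adding an $\ell_1$ term gives

$$ J_n(\mathbf{w}) = \tfrac{1}{2}\,\mathbf{w}^{\mathsf T}\mathbf{R}_n\mathbf{w} - \mathbf{p}_n^{\mathsf T}\mathbf{w} + \gamma\,\lVert\mathbf{w}\rVert_1, \qquad \nabla J_n(\mathbf{w}) = \mathbf{R}_n\mathbf{w} - \mathbf{p}_n + \gamma\,\operatorname{sgn}(\mathbf{w}), $$

so the conjugate-gradient search direction acquires a zero-attracting term $-\gamma\,\operatorname{sgn}(\mathbf{w})$ that pulls inactive taps toward zero. For the $\ell_0$ case, the non-differentiable pseudo-norm is commonly replaced by a smooth surrogate such as $\lVert\mathbf{w}\rVert_0 \approx \sum_i \bigl(1 - e^{-\beta\lvert w_i\rvert}\bigr)$, whose gradient $\beta\,\operatorname{sgn}(w_i)\,e^{-\beta\lvert w_i\rvert}$ attracts only taps that are already near zero, leaving the large (active) taps essentially unbiased.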