{"title":"基于群平方根弹性网和迭代多元阈值算法的群稀疏恢复","authors":"Wanling Xie, Hu Yang","doi":"10.1007/s10182-022-00443-x","DOIUrl":null,"url":null,"abstract":"<div><p>In this work, we propose a novel group selection method called Group Square-Root Elastic Net. It is based on square-root regularization with a group elastic net penalty, i.e., a <span>\\(\\ell _{2,1}+\\ell _2\\)</span> penalty. As a type of square-root-based procedure, one distinct feature is that the estimator is independent of the unknown noise level <span>\\(\\sigma \\)</span>, which is non-trivial to estimate under the high-dimensional setting, especially when <span>\\(p\\gg n\\)</span>. In many applications, the estimator is expected to be sparse, not in an irregular way, but rather in a structured manner. It makes the proposed method very attractive to tackle both high-dimensionality and structured sparsity. We study the correct subset recovery under a Group Elastic Net Irrepresentable Condition. Both the slow rate bounds and fast rate bounds are established, the latter under the Restricted Eigenvalue assumption and Gaussian noise assumption. To implement, a fast algorithm based on the scaled multivariate thresholding-based iterative selection idea is introduced with proved convergence. A comparative study examines the superiority of our approach against alternatives.</p></div>","PeriodicalId":55446,"journal":{"name":"Asta-Advances in Statistical Analysis","volume":"107 3","pages":"469 - 507"},"PeriodicalIF":1.4000,"publicationDate":"2022-04-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm\",\"authors\":\"Wanling Xie, Hu Yang\",\"doi\":\"10.1007/s10182-022-00443-x\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>In this work, we propose a novel group selection method called Group Square-Root Elastic Net. It is based on square-root regularization with a group elastic net penalty, i.e., a <span>\\\\(\\\\ell _{2,1}+\\\\ell _2\\\\)</span> penalty. As a type of square-root-based procedure, one distinct feature is that the estimator is independent of the unknown noise level <span>\\\\(\\\\sigma \\\\)</span>, which is non-trivial to estimate under the high-dimensional setting, especially when <span>\\\\(p\\\\gg n\\\\)</span>. In many applications, the estimator is expected to be sparse, not in an irregular way, but rather in a structured manner. It makes the proposed method very attractive to tackle both high-dimensionality and structured sparsity. We study the correct subset recovery under a Group Elastic Net Irrepresentable Condition. Both the slow rate bounds and fast rate bounds are established, the latter under the Restricted Eigenvalue assumption and Gaussian noise assumption. To implement, a fast algorithm based on the scaled multivariate thresholding-based iterative selection idea is introduced with proved convergence. 
A comparative study examines the superiority of our approach against alternatives.</p></div>\",\"PeriodicalId\":55446,\"journal\":{\"name\":\"Asta-Advances in Statistical Analysis\",\"volume\":\"107 3\",\"pages\":\"469 - 507\"},\"PeriodicalIF\":1.4000,\"publicationDate\":\"2022-04-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Asta-Advances in Statistical Analysis\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://link.springer.com/article/10.1007/s10182-022-00443-x\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Asta-Advances in Statistical Analysis","FirstCategoryId":"100","ListUrlMain":"https://link.springer.com/article/10.1007/s10182-022-00443-x","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
In this work, we propose a novel group selection method called the Group Square-Root Elastic Net. It is based on square-root regularization with a group elastic net penalty, i.e., an \(\ell_{2,1}+\ell_2\) penalty. As a square-root-based procedure, it has the distinct feature that the estimator is independent of the unknown noise level \(\sigma\), which is non-trivial to estimate in the high-dimensional setting, especially when \(p \gg n\). In many applications, the estimator is expected to be sparse not in an irregular way but in a structured manner, which makes the proposed method very attractive for tackling both high dimensionality and structured sparsity. We study correct subset recovery under a Group Elastic Net Irrepresentable Condition. Both slow-rate and fast-rate bounds are established, the latter under the Restricted Eigenvalue and Gaussian noise assumptions. For implementation, a fast algorithm based on scaled multivariate thresholding-based iterative selection is introduced, with proven convergence. A comparative study demonstrates the advantages of our approach over alternatives.
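To make the penalty structure concrete, below is a minimal Python sketch of a group square-root elastic net objective and of a group-wise (multivariate) soft-thresholding operator of the kind iterated in thresholding-based algorithms. This is an illustration, not the authors' implementation: the function names, the tuning parameters lam1 and lam2, and the choice of a squared ridge term are assumptions made here for exposition; the paper's exact penalty scaling and its scaled iterative algorithm may differ.

```python
import numpy as np

def group_sqrt_elastic_net_objective(beta, X, y, groups, lam1, lam2):
    """Square-root loss plus a group elastic net penalty (illustrative form).

    groups: list of index arrays, one per non-overlapping group.
    lam1 weights the l_{2,1} (group lasso) term, lam2 the ridge term.
    """
    n = X.shape[0]
    sqrt_loss = np.linalg.norm(y - X @ beta) / np.sqrt(n)      # noise-level-free loss
    group_l21 = sum(np.linalg.norm(beta[g]) for g in groups)   # l_{2,1} penalty
    ridge = np.sum(beta ** 2)                                   # ridge term (squared l_2, an assumption here)
    return sqrt_loss + lam1 * group_l21 + lam2 * ridge

def groupwise_soft_threshold(z, groups, tau):
    """Multivariate soft-thresholding: shrink each group block toward zero
    by tau in the l_2 norm, setting the block to zero if its norm is below tau."""
    out = np.zeros_like(z, dtype=float)
    for g in groups:
        norm_g = np.linalg.norm(z[g])
        if norm_g > tau:
            out[g] = (1.0 - tau / norm_g) * z[g]
    return out

# Toy usage on simulated data with two groups of three coefficients each.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 6))
beta_true = np.array([1.0, -1.0, 0.5, 0.0, 0.0, 0.0])
y = X @ beta_true + 0.1 * rng.standard_normal(50)
groups = [np.arange(0, 3), np.arange(3, 6)]
print(group_sqrt_elastic_net_objective(np.zeros(6), X, y, groups, lam1=0.1, lam2=0.01))
print(groupwise_soft_threshold(beta_true, groups, tau=0.2))
```

The group-wise soft-thresholding operator is the building block that multivariate thresholding-based iterative schemes apply repeatedly; how the step size and threshold are scaled at each iteration is what distinguishes the algorithm proposed in the paper.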
Journal description:
AStA - Advances in Statistical Analysis, a journal of the German Statistical Society, is published quarterly and presents original contributions on statistical methods and applications, as well as review articles.