An accelerated proximal alternating direction method of multipliers for robust fused Lasso

Yibao Fan, Youlin Shang, Zheng-Fen Jin, Jia Liu, Roxin Zhang
{"title":"鲁棒融合套索的加速近端交替方向乘法器方法","authors":"Yibao Fan, Youlin Shang, Zheng-Fen Jin, Jia Liu, Roxin Zhang","doi":"10.1051/ro/2023065","DOIUrl":null,"url":null,"abstract":"In the era of big data, much of the data is susceptible to noise with heavy-tailed distribution. Fused Lasso can effectively handle high dimensional sparse data with strong correlation between two adjacent variables under known Gaussian noise. However, it has poor robustness to nonGaussian noise with heavy-tailed distribution. Robust fused Lasso with l1 norm loss function can overcome the drawback of fused Lasso when noise is heavy-tailed distribution. But the key challenge for solving this model is nonsmoothness and its nonseparability. Therefore, in this paper, we first deform the robust fused Lasso into an easily solvable form, which changes the three-block objective function to a two-block form. Then, we propose an accelerated proximal alternating direction method of multipliers (APADMM) with an additional update step, which is base on a new PADMM that changes the Lagrangian multiplier term update. Furthermore, we give the O(1/K) nonergodic convergence rate analysis of the proposed APADMM. Finally, numerical results show that the proposed new PADMM and APADMM have better performance than other existing ADMM solvers.","PeriodicalId":20872,"journal":{"name":"RAIRO Oper. Res.","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-05-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"An accelerated proximal alternating direction method of multipliers for robust fused Lasso\",\"authors\":\"Yibao Fan, Youlin Shang, Zheng-Fen Jin, Jia Liu, Roxin Zhang\",\"doi\":\"10.1051/ro/2023065\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In the era of big data, much of the data is susceptible to noise with heavy-tailed distribution. Fused Lasso can effectively handle high dimensional sparse data with strong correlation between two adjacent variables under known Gaussian noise. However, it has poor robustness to nonGaussian noise with heavy-tailed distribution. Robust fused Lasso with l1 norm loss function can overcome the drawback of fused Lasso when noise is heavy-tailed distribution. But the key challenge for solving this model is nonsmoothness and its nonseparability. Therefore, in this paper, we first deform the robust fused Lasso into an easily solvable form, which changes the three-block objective function to a two-block form. Then, we propose an accelerated proximal alternating direction method of multipliers (APADMM) with an additional update step, which is base on a new PADMM that changes the Lagrangian multiplier term update. Furthermore, we give the O(1/K) nonergodic convergence rate analysis of the proposed APADMM. Finally, numerical results show that the proposed new PADMM and APADMM have better performance than other existing ADMM solvers.\",\"PeriodicalId\":20872,\"journal\":{\"name\":\"RAIRO Oper. Res.\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-05-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"RAIRO Oper. 
Res.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1051/ro/2023065\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"RAIRO Oper. Res.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1051/ro/2023065","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

In the era of big data, much of the data is susceptible to noise with a heavy-tailed distribution. The fused Lasso can effectively handle high-dimensional sparse data in which adjacent variables are strongly correlated, provided the noise is Gaussian. However, it is not robust to non-Gaussian noise with a heavy-tailed distribution. The robust fused Lasso, which uses an l1-norm loss function, can overcome this drawback of the fused Lasso when the noise is heavy-tailed. The key challenge in solving this model is that it is both nonsmooth and nonseparable. Therefore, in this paper, we first reformulate the robust fused Lasso into an easily solvable form, turning the three-block objective function into a two-block one. We then propose an accelerated proximal alternating direction method of multipliers (APADMM) with an additional update step, which is based on a new PADMM that modifies the update of the Lagrangian multiplier term. Furthermore, we give an O(1/K) nonergodic convergence rate analysis of the proposed APADMM. Finally, numerical results show that the proposed new PADMM and APADMM perform better than other existing ADMM solvers.
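For concreteness, a common way to write the robust fused Lasso model described in the abstract uses an l1-norm data-fit term together with l1 penalties on the coefficients and on their successive differences. The paper's exact notation and scaling may differ, so the following should be read as an illustrative formulation only:

```latex
\min_{x \in \mathbb{R}^{n}} \; \|Ax - b\|_{1}
  \; + \; \lambda_{1} \|x\|_{1}
  \; + \; \lambda_{2} \sum_{i=2}^{n} |x_{i} - x_{i-1}|
```

Here A is the design matrix, b the response vector, and λ1, λ2 > 0 the sparsity and fusion parameters; replacing the l1 data-fit term with the least-squares loss (1/2)||Ax − b||² recovers the classical fused Lasso.

The sketch below solves this model with a plain two-block consensus ADMM on the splitting y = Bx, with B stacking A, the identity, and the first-difference matrix D. It is not the PADMM or APADMM proposed in the paper (whose reformulation, proximal terms, and acceleration step are specific to it); it is only a minimal baseline, with hypothetical names such as robust_fused_lasso_admm, lam1, lam2, and rho, showing how the three nonsmooth terms separate into elementwise soft-thresholding steps.

```python
import numpy as np

def soft_threshold(v, kappa):
    """Elementwise soft-thresholding: the proximal operator of kappa * |.|."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def robust_fused_lasso_admm(A, b, lam1, lam2, rho=1.0, n_iter=500):
    """Generic two-block consensus ADMM for
        min_x ||Ax - b||_1 + lam1*||x||_1 + lam2*||Dx||_1,
    via the splitting y = Bx with B = [A; I; D].
    Illustrative baseline only; NOT the PADMM/APADMM of the paper."""
    m, n = A.shape
    D = np.diff(np.eye(n), axis=0)        # first-difference matrix, (n-1) x n
    B = np.vstack([A, np.eye(n), D])      # stacked linear operator
    BtB = B.T @ B                         # A'A + I + D'D, positive definite
    x = np.zeros(n)
    y = np.zeros(B.shape[0])
    u = np.zeros(B.shape[0])              # scaled dual variable
    for _ in range(n_iter):
        # x-update: least-squares step, argmin_x (rho/2)*||Bx - y + u||^2
        x = np.linalg.solve(BtB, B.T @ (y - u))
        Bx = B @ x
        v = Bx + u
        # y-update: separable proximal steps on the three blocks
        y_A = b + soft_threshold(v[:m] - b, 1.0 / rho)    # l1 data-fit block
        y_I = soft_threshold(v[m:m + n], lam1 / rho)      # sparsity block
        y_D = soft_threshold(v[m + n:], lam2 / rho)       # fused-difference block
        y = np.concatenate([y_A, y_I, y_D])
        # dual update
        u = u + Bx - y
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 50
    x_true = np.zeros(n)
    x_true[10:20] = 2.0                                   # piecewise-constant signal
    x_true[30:35] = -1.0
    A = rng.standard_normal((80, n))
    b = A @ x_true + 0.1 * rng.standard_t(df=2, size=80)  # heavy-tailed (Student-t) noise
    x_hat = robust_fused_lasso_admm(A, b, lam1=0.5, lam2=2.0)
    print("relative recovery error:",
          np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

Because the identity block is part of B, the x-update system B'B = A'A + I + D'D is always invertible, and each nonsmooth term is handled by a cheap closed-form soft-thresholding step.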