Efficient low rank matrix recovery with flexible group sparse regularization

Quan Yu, Minru Bai, Xinzhen Zhang
IMA Journal of Numerical Analysis | IF 2.3 | JCR Q1 (Mathematics, Applied) | CAS Region 2 (Mathematics)
DOI: 10.1093/imanum/drae099
Published: 2025-01-30
Citations: 0

Abstract

In this paper, we present a novel approach to the low rank matrix recovery (LRMR) problem by casting it as a group sparsity problem. Specifically, we propose a flexible group sparse regularizer (FLGSR) that can group any number of matrix columns as a unit, whereas existing methods group each column as a unit. We prove the equivalence between the matrix rank and the FLGSR under some mild conditions, and show that the LRMR problem with either of them has the same global minimizers. We also establish the equivalence between the relaxed and the penalty formulations of the LRMR problem with FLGSR. We then propose an inexact restarted augmented Lagrangian method, which solves each subproblem by an extrapolated linearized alternating minimization method. We analyse the convergence of our method. Remarkably, our method linearizes each group of the variable separately and uses the information of the previous groups to solve the current group within the same iteration step. This strategy enables our algorithm to achieve fast convergence and high performance, which are further improved by the restart technique. Finally, we conduct numerical experiments on both grayscale images and high altitude aerial images to confirm the superiority of the proposed FLGSR and algorithm.
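To make the grouping idea concrete, here is a minimal illustrative sketch (not the paper's FLGSR definition, whose exact form is not given in the abstract): it contrasts ordinary column-wise sparsity, where every column is its own group, with a flexible grouping in which each block of `group_size` consecutive columns counts as one unit. The function name `group_sparsity`, the choice of consecutive column blocks, and the tolerance are assumptions.

```python
# Illustrative sketch of column-group sparsity; the paper's FLGSR may differ
# (its equivalence with the rank holds only under the paper's conditions).
import numpy as np

def group_sparsity(X, group_size=1, tol=1e-12):
    """Count column groups of X with nonzero Frobenius norm.

    group_size=1 recovers ordinary column-wise sparsity; group_size>1 treats
    each block of `group_size` consecutive columns as a single unit.
    """
    n = X.shape[1]
    blocks = [X[:, i:i + group_size] for i in range(0, n, group_size)]
    return sum(np.linalg.norm(B, "fro") > tol for B in blocks)

# A rank-2 matrix whose nonzero entries occupy the first four columns.
rng = np.random.default_rng(0)
X = np.zeros((8, 12))
X[:, :4] = rng.standard_normal((8, 2)) @ rng.standard_normal((2, 4))

print(group_sparsity(X, group_size=1))   # 4 nonzero columns
print(group_sparsity(X, group_size=2))   # 2 nonzero two-column groups
print(np.linalg.matrix_rank(X))          # 2
```

In this toy case the two-column grouping matches the rank while the per-column count does not; the paper's equivalence result is, of course, stated for its own FLGSR under its own conditions.

The abstract also notes that the inner solver updates the variable group by group, reusing the groups already updated within the same iteration step. The sketch below illustrates that Gauss-Seidel-style, extrapolated block update on a generic least-squares objective 0.5*||A X - B||_F^2 with row groups of X; the objective, step sizes, and names are assumptions, not the paper's extrapolated linearized alternating minimization method.

```python
# Generic sketch of Gauss-Seidel-style extrapolated block updates (not the
# paper's algorithm): within one sweep, each block is linearized at an
# extrapolated point and the residual already reflects earlier blocks.
import numpy as np

def gauss_seidel_block_descent(A, B, group_size=2, iters=100, beta=0.5):
    m, n = A.shape
    q = B.shape[1]
    X = np.zeros((n, q))
    X_prev = X.copy()
    groups = [np.arange(i, min(i + group_size, n)) for i in range(0, n, group_size)]
    for _ in range(iters):
        X_old = X.copy()
        R = A @ X - B                                  # residual, kept in sync block by block
        for g in groups:
            Ag = A[:, g]
            Lg = np.linalg.norm(Ag, 2) ** 2 + 1e-12    # block Lipschitz constant
            Yg = X[g, :] + beta * (X[g, :] - X_prev[g, :])   # extrapolated block
            R_extra = R + Ag @ (Yg - X[g, :])          # residual at the extrapolated point
            grad_g = Ag.T @ R_extra                    # block gradient of 0.5*||A X - B||_F^2
            X_new_g = Yg - grad_g / Lg
            R = R + Ag @ (X_new_g - X[g, :])           # later blocks see this update
            X[g, :] = X_new_g
        X_prev = X_old
    return X
```

With beta = 0 this reduces to plain cyclic block gradient descent; the point here is only the update order, in which the residual R already contains the blocks refreshed earlier in the same sweep.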
Source journal: IMA Journal of Numerical Analysis (Mathematics - Applied Mathematics)
CiteScore: 5.30
Self-citation rate: 4.80%
Articles published: 79
Review time: 6-12 weeks
About the journal: The IMA Journal of Numerical Analysis (IMAJNA) publishes original contributions to all fields of numerical analysis; articles will be accepted which treat the theory, development or use of practical algorithms and interactions between these aspects. Occasional survey articles are also published.