Difference-of-Convex Algorithm with Extrapolation for Nonconvex, Nonsmooth Optimization Problems

IF 1.9 · CAS Region 3 (Mathematics) · JCR Q2 (Mathematics, Applied)
Duy Nhat Phan, Hoai An Le Thi
{"title":"Difference-of-Convex Algorithm with Extrapolation for Nonconvex, Nonsmooth Optimization Problems","authors":"Duy Nhat Phan, Hoai An Le Thi","doi":"10.1287/moor.2020.0393","DOIUrl":null,"url":null,"abstract":"In this paper, we focus on the problem of minimizing the sum of a nonconvex differentiable function and a difference of convex (DC) function, where the differentiable function is not restricted to the global Lipschitz gradient continuity assumption. This problem covers a broad range of applications in machine learning and statistics, such as compressed sensing, signal recovery, sparse dictionary learning, matrix factorization, etc. We first take inspiration from the Nesterov acceleration technique and the DC algorithm to develop a novel algorithm for the considered problem. We then study the subsequential convergence of our algorithm to a critical point. Furthermore, we justify the global convergence of the whole sequence generated by our algorithm to a critical point and establish its convergence rate under the Kurdyka–Łojasiewicz condition. Numerical experiments on the nonnegative matrix completion problem are performed to demonstrate the efficiency of our algorithm and its superiority over well-known methods.","PeriodicalId":49852,"journal":{"name":"Mathematics of Operations Research","volume":"36 1","pages":"0"},"PeriodicalIF":1.9000,"publicationDate":"2023-09-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Mathematics of Operations Research","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1287/moor.2020.0393","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0

Abstract

In this paper, we focus on the problem of minimizing the sum of a nonconvex differentiable function and a difference-of-convex (DC) function, where the gradient of the differentiable function is not required to be globally Lipschitz continuous. This problem covers a broad range of applications in machine learning and statistics, such as compressed sensing, signal recovery, sparse dictionary learning, and matrix factorization. We first take inspiration from the Nesterov acceleration technique and the DC algorithm to develop a novel algorithm for the considered problem. We then study the subsequential convergence of our algorithm to a critical point. Furthermore, we justify the global convergence of the whole sequence generated by our algorithm to a critical point and establish its convergence rate under the Kurdyka–Łojasiewicz condition. Numerical experiments on the nonnegative matrix completion problem are performed to demonstrate the efficiency of our algorithm and its superiority over well-known methods.
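To make the general scheme concrete, below is a minimal, illustrative sketch of a DC algorithm with Nesterov-style extrapolation of the kind the abstract describes. The paper's exact update rule and stepsize strategy (which accommodate a gradient that is not globally Lipschitz) are not reproduced here; this sketch assumes a proximal DCA-with-extrapolation update on a simple sparse-recovery instance with a capped-l1 penalty written as a DC program. The problem instance, the fixed stepsize 1/L, and the function names `dca_extrapolation` and `soft_threshold` are assumptions made for illustration, not taken from the paper.

```python
# Sketch of a DC algorithm with extrapolation for
#     min_x  f(x) + g(x) - h(x),
# where f(x) = 0.5*||Ax - b||^2              (smooth part),
#       g(x) = lam*||x||_1                   (convex, prox = soft-thresholding),
#       h(x) = lam*sum_i max(|x_i|-theta, 0) (convex; g - h is the capped-l1 penalty).
# The paper also covers f without a globally Lipschitz gradient; the fixed
# stepsize 1/L below is a simplification for this smooth instance.

import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau*||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def dca_extrapolation(A, b, lam=0.1, theta=1.0, max_iter=500, tol=1e-8):
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of grad f
    x_prev = x = np.zeros(n)
    t_prev = t = 1.0                   # FISTA-type extrapolation sequence

    for _ in range(max_iter):
        # Nesterov-style extrapolation point
        beta = (t_prev - 1.0) / t
        y = x + beta * (x - x_prev)

        # Subgradient of the concave part -h, taken at the current iterate
        xi = lam * np.sign(x) * (np.abs(x) > theta)

        # Proximal gradient step on f + g after linearizing h at x
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - (grad - xi) / L, lam / L)

        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            x = x_new
            break

        # Update extrapolation parameters and iterates
        t_prev, t = t, 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        x_prev, x = x, x_new

    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((80, 200))
    x_true = np.zeros(200)
    x_true[:5] = 3.0 * rng.standard_normal(5)
    b = A @ x_true + 0.01 * rng.standard_normal(80)
    x_hat = dca_extrapolation(A, b, lam=0.1, theta=1.0)
    print("nonzeros recovered:", int(np.sum(np.abs(x_hat) > 1e-3)))
```

Each iteration linearizes only the concave part -h and the smooth part f at the extrapolated point, so the subproblem reduces to a single soft-thresholding step; the extrapolation sequence mirrors the FISTA momentum rule, which is one common choice but not necessarily the one analyzed in the paper.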
Source Journal

Mathematics of Operations Research (Management Science, Applied Mathematics)
CiteScore: 3.40
Self-citation rate: 5.90%
Articles published per year: 178
Review time: 15.0 months
Journal description: Mathematics of Operations Research is an international journal of the Institute for Operations Research and the Management Sciences (INFORMS). The journal invites articles concerned with the mathematical and computational foundations in the areas of continuous, discrete, and stochastic optimization; mathematical programming; dynamic programming; stochastic processes; stochastic models; simulation methodology; control and adaptation; networks; game theory; and decision theory. Also sought are contributions to learning theory and machine learning that have special relevance to decision making, operations research, and management science. The emphasis is on originality, quality, and importance; correctness alone is not sufficient. Significant developments in operations research and management science not having substantial mathematical interest should be directed to other journals such as Management Science or Operations Research.