A subgradient method with constant step-size for $$\ell _1$$-composite optimization

A. Scagliotti, P. Colli Franzone
{"title":"一种恒步长次梯度方法用于$$\\ell _1$$ -复合优化","authors":"A. Scagliotti, P. Colli Franzone","doi":"10.1007/s40574-023-00389-1","DOIUrl":null,"url":null,"abstract":"Abstract Subgradient methods are the natural extension to the non-smooth case of the classical gradient descent for regular convex optimization problems. However, in general, they are characterized by slow convergence rates, and they require decreasing step-sizes to converge. In this paper we propose a subgradient method with constant step-size for composite convex objectives with $$\\ell _1$$ <mml:math xmlns:mml=\"http://www.w3.org/1998/Math/MathML\"> <mml:msub> <mml:mi>ℓ</mml:mi> <mml:mn>1</mml:mn> </mml:msub> </mml:math> -regularization. If the smooth term is strongly convex, we can establish a linear convergence result for the function values. This fact relies on an accurate choice of the element of the subdifferential used for the update, and on proper actions adopted when non-differentiability regions are crossed. Then, we propose an accelerated version of the algorithm, based on conservative inertial dynamics and on an adaptive restart strategy, that is guaranteed to achieve a linear convergence rate in the strongly convex case. Finally, we test the performances of our algorithms on some strongly and non-strongly convex examples.","PeriodicalId":214688,"journal":{"name":"Bollettino dell'Unione Matematica Italiana","volume":"40 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-10-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A subgradient method with constant step-size for $$\\\\ell _1$$-composite optimization\",\"authors\":\"A. Scagliotti, P. Colli Franzone\",\"doi\":\"10.1007/s40574-023-00389-1\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract Subgradient methods are the natural extension to the non-smooth case of the classical gradient descent for regular convex optimization problems. However, in general, they are characterized by slow convergence rates, and they require decreasing step-sizes to converge. In this paper we propose a subgradient method with constant step-size for composite convex objectives with $$\\\\ell _1$$ <mml:math xmlns:mml=\\\"http://www.w3.org/1998/Math/MathML\\\"> <mml:msub> <mml:mi>ℓ</mml:mi> <mml:mn>1</mml:mn> </mml:msub> </mml:math> -regularization. If the smooth term is strongly convex, we can establish a linear convergence result for the function values. This fact relies on an accurate choice of the element of the subdifferential used for the update, and on proper actions adopted when non-differentiability regions are crossed. Then, we propose an accelerated version of the algorithm, based on conservative inertial dynamics and on an adaptive restart strategy, that is guaranteed to achieve a linear convergence rate in the strongly convex case. 
Finally, we test the performances of our algorithms on some strongly and non-strongly convex examples.\",\"PeriodicalId\":214688,\"journal\":{\"name\":\"Bollettino dell'Unione Matematica Italiana\",\"volume\":\"40 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-10-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Bollettino dell'Unione Matematica Italiana\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1007/s40574-023-00389-1\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Bollettino dell'Unione Matematica Italiana","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s40574-023-00389-1","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Subgradient methods are the natural extension of classical gradient descent to non-smooth convex optimization problems. In general, however, they exhibit slow convergence rates and require decreasing step-sizes to converge. In this paper we propose a subgradient method with constant step-size for composite convex objectives with $$\ell _1$$-regularization. If the smooth term is strongly convex, we can establish a linear convergence result for the function values. This fact relies on an accurate choice of the element of the subdifferential used for the update, and on proper actions adopted when non-differentiability regions are crossed. We then propose an accelerated version of the algorithm, based on conservative inertial dynamics and on an adaptive restart strategy, which is guaranteed to achieve a linear convergence rate in the strongly convex case. Finally, we test the performance of our algorithms on some strongly and non-strongly convex examples.
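
The abstract only outlines the algorithmic ideas, so the following is a minimal, illustrative Python sketch of a constant-step-size subgradient iteration for $$F(x) = f(x) + \lambda \Vert x\Vert _1$$, together with an inertial variant with restart. The specific choices made here — picking, at zero coordinates, the subdifferential element that cancels as much of the smooth gradient as possible; snapping to zero any coordinate whose sign would flip; heavy-ball-style inertia; and a restart triggered by an increase of the objective value — are assumptions inspired by the abstract's description, not the authors' exact scheme, and all function names and the quadratic test problem are invented for illustration.

```python
import numpy as np

def subgradient_choice(x, grad_fx, lam):
    """Return one element of the subdifferential of F(x) = f(x) + lam*||x||_1.

    Where x_i != 0 the l1 term is differentiable and contributes lam*sign(x_i).
    Where x_i == 0 its subdifferential is the interval [-lam, lam]; here we pick
    the element that cancels as much of grad f as possible, so coordinates with
    |grad_f_i| <= lam give a zero component and the iterate stays put there.
    """
    g = grad_fx.astype(float)
    nz = x != 0.0
    g[nz] += lam * np.sign(x[nz])
    g[~nz] += np.clip(-g[~nz], -lam, lam)
    return g


def constant_step_subgradient(x0, grad_f, lam, h, iters=500):
    """Constant-step-size iteration; a coordinate whose sign would flip is
    snapped to 0, one possible 'action' when a kink of ||.||_1 is crossed."""
    x = x0.astype(float).copy()
    for _ in range(iters):
        x_new = x - h * subgradient_choice(x, grad_f(x), lam)
        x_new[x * x_new < 0.0] = 0.0
        x = x_new
    return x


def inertial_subgradient_restart(x0, grad_f, F, lam, h, beta=0.9, iters=500):
    """Heavy-ball-style inertia with an adaptive restart: the velocity is reset
    whenever the objective value increases (a simple function-value restart)."""
    x = x0.astype(float).copy()
    v = np.zeros_like(x)
    F_prev = F(x)
    for _ in range(iters):
        v = beta * v - h * subgradient_choice(x, grad_f(x), lam)
        x_new = x + v
        x_new[x * x_new < 0.0] = 0.0
        F_new = F(x_new)
        if F_new > F_prev:          # inertia is working against descent: restart
            v = np.zeros_like(x)
        x, F_prev = x_new, F_new
    return x


if __name__ == "__main__":
    # Toy strongly convex smooth term: f(x) = 0.5*||Ax - b||^2, A of full column rank.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 20)) + 3.0 * np.eye(40, 20)
    b = rng.standard_normal(40)
    lam = 0.5
    h = 1.0 / np.linalg.norm(A.T @ A, 2)          # constant step-size ~ 1/L
    grad_f = lambda x: A.T @ (A @ x - b)
    F = lambda x: 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))

    x_plain = constant_step_subgradient(np.zeros(20), grad_f, lam, h)
    x_fast = inertial_subgradient_restart(np.zeros(20), grad_f, F, lam, h)
    print("plain:", F(x_plain), " inertial:", F(x_fast))
```

On this toy strongly convex least-squares instance both loops drive the objective down rapidly; the function-value restart keeps the inertial variant essentially monotone, which is the practical role an adaptive restart plays in schemes of this kind.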