An aggressive reduction on the complexity of optimization for non-strongly convex objectives

Zhijian Luo, Siyu Chen, Yueen Hou, Yanzeng Gao, Y. Qian
{"title":"An aggressive reduction on the complexity of optimization for non-strongly convex objectives","authors":"Zhijian Luo, Siyu Chen, Yueen Hou, Yanzeng Gao, Y. Qian","doi":"10.1142/s0219691323500170","DOIUrl":null,"url":null,"abstract":"Tremendous efficient optimization methods have been proposed for strongly convex objectives optimization in modern machine learning. For non-strongly convex objectives, a popular approach is to apply a reduction from non-strongly convex to a strongly convex case via regularization techniques. Reduction on objectives with adaptive decrease on regularization tightens the optimal convergence of algorithms to be independent on logarithm factor. However, the initialization of parameter of regularization has a great impact on the performance of the reduction. In this paper, we propose an aggressive reduction to reduce the complexity of optimization for non-strongly convex objectives, and our reduction eliminates the impact of the initialization of parameter on the convergent performances of algorithms. Aggressive reduction not only adaptively decreases the regularization parameter, but also modifies regularization term as the distance between current point and the approximate minimizer. Our aggressive reduction can also shave off the non-optimal logarithm term theoretically, and make the convergent performance of algorithm more compact practically. Experimental results on logistic regression and image deblurring confirm this success in practice.","PeriodicalId":158567,"journal":{"name":"Int. J. Wavelets Multiresolution Inf. Process.","volume":"24 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Int. J. Wavelets Multiresolution Inf. Process.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1142/s0219691323500170","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Numerous efficient optimization methods have been proposed for strongly convex objectives in modern machine learning. For non-strongly convex objectives, a popular approach is to reduce the non-strongly convex problem to a strongly convex one via regularization: a strongly convex term is added to the objective and then progressively weakened. A reduction that adaptively decreases the regularization parameter tightens the optimal convergence rate of the resulting algorithms so that it no longer depends on a logarithmic factor. However, the initialization of the regularization parameter strongly affects the performance of such reductions. In this paper, we propose an aggressive reduction that lowers the complexity of optimization for non-strongly convex objectives and eliminates the impact of the parameter's initialization on the convergence of the resulting algorithms. The aggressive reduction not only adaptively decreases the regularization parameter, but also re-centers the regularization term as the distance between the current point and the current approximate minimizer. Theoretically, our aggressive reduction shaves off the non-optimal logarithmic term; practically, it tightens the convergence behavior of the algorithm. Experimental results on logistic regression and image deblurring confirm this success in practice.
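To make the reduction concrete, below is a minimal sketch, not the paper's exact algorithm: the inner `gradient_descent` solver, the halving schedule for `sigma`, the squared-distance form of the regularizer, and the round/iteration counts are all illustrative assumptions.

```python
import numpy as np

def gradient_descent(grad, x0, strong_convexity, lipschitz, iters=200):
    # Inner solver for each strongly convex surrogate: plain gradient
    # descent with the standard step size 1 / (L + sigma).
    x = x0.copy()
    step = 1.0 / (lipschitz + strong_convexity)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

def aggressive_reduction(grad_f, x0, lipschitz, sigma0=1.0, rounds=10):
    # Each round minimizes the surrogate f(x) + (sigma/2) * ||x - center||^2,
    # which is sigma-strongly convex, then re-centers the regularizer at the
    # new approximate minimizer (the "aggressive" step) and halves sigma
    # (the adaptive decrease of the regularization parameter).
    x, sigma, center = x0.copy(), sigma0, x0.copy()
    for _ in range(rounds):
        surrogate_grad = lambda z, s=sigma, c=center: grad_f(z) + s * (z - c)
        x = gradient_descent(surrogate_grad, x, strong_convexity=sigma,
                             lipschitz=lipschitz)
        center = x.copy()
        sigma /= 2.0
    return x

# Toy usage: least squares with a rank-deficient matrix, so the objective
# f(x) = 0.5 * ||A x - b||^2 is convex but not strongly convex.
A = np.array([[1.0, 2.0], [2.0, 4.0]])   # rank 1
b = np.array([1.0, 2.0])                 # consistent: b lies in range(A)
grad_f = lambda x: A.T @ (A @ x - b)
L = np.linalg.eigvalsh(A.T @ A).max()    # Lipschitz constant of grad_f
x_hat = aggressive_reduction(grad_f, np.zeros(2), lipschitz=L)
print(x_hat, np.linalg.norm(A @ x_hat - b))
```

The per-round re-centering is what distinguishes this aggressive variant from a reduction that keeps a fixed regularization center; the abstract's claim is that this removes the non-optimal logarithmic factor from the complexity bound and makes the result insensitive to the initial value of the regularization parameter.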