Fast Convex Optimization via Differential Equation with Hessian-Driven Damping and Tikhonov Regularization
Gangfan Zhong, Xiaozhe Hu, Ming Tang, Liuqiang Zhong
Journal of Optimization Theory and Applications, published 2024-05-30. DOI: 10.1007/s10957-024-02462-x (https://doi.org/10.1007/s10957-024-02462-x)
Abstract
In this paper, we consider a class of second-order ordinary differential equations with Hessian-driven damping and Tikhonov regularization, which arises from the minimization of a smooth convex function in Hilbert spaces. Inspired by Attouch et al. (J Differ Equ 261:5734–5783, 2016), we establish that the function value along the solution trajectory converges to the optimal value, and prove that the convergence rate can be as fast as \(o(1/t^2)\). By constructing a suitable energy function, we prove that the trajectory converges strongly to the minimum-norm minimizer of the objective function. Moreover, we propose a gradient-based optimization algorithm obtained by numerical discretization of the differential equation, and demonstrate its effectiveness in numerical experiments.
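The abstract does not reproduce the dynamics or the discretization. For orientation only: systems of this type are commonly written, following Attouch et al., as \(\ddot{x}(t) + \frac{\alpha}{t}\dot{x}(t) + \beta\,\nabla^2 f(x(t))\dot{x}(t) + \nabla f(x(t)) + \epsilon(t)\,x(t) = 0\), where \(\epsilon(t)\) is the Tikhonov regularization parameter; since \(\nabla^2 f(x(t))\dot{x}(t) = \frac{d}{dt}\nabla f(x(t))\), an explicit discretization needs only gradient evaluations. The Python sketch below illustrates one such discretization. The concrete coefficients, the choice \(\epsilon(t) = c/t^2\), the time offset, and the step sizes are assumptions made for this illustration and are not the scheme analyzed in the paper.

```python
import numpy as np

def inertial_tikhonov_gradient(grad, x0, alpha=3.0, beta=1.0, c=1.0,
                               h=1e-2, t0=1.0, n_iter=2000):
    """Sketch of an explicit discretization of
        x'' + (alpha/t) x' + beta * d/dt[grad f(x)] + grad f(x) + (c/t^2) x = 0.
    The Hessian-driven damping term is approximated by a finite difference
    of gradients, so only gradient evaluations are required.
    All parameter values here are illustrative assumptions."""
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    g_prev = grad(x_prev)
    for k in range(1, n_iter + 1):
        t = t0 + k * h        # shifted time avoids the singular damping at t = 0
        g = grad(x)
        eps = c / t**2        # Tikhonov regularization parameter epsilon(t)
        v = (x - x_prev) / h  # discrete velocity
        dg = (g - g_prev) / h # finite-difference surrogate for Hess f(x) * velocity
        accel = -(alpha / t) * v - beta * dg - g - eps * x
        x_next = x + h * v + h**2 * accel
        x_prev, x, g_prev = x, x_next, g
    return x

# Usage on a strongly convex quadratic f(x) = 0.5 * ||A x - b||^2.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
grad_f = lambda x: A.T @ (A @ x - b)
print(inertial_tikhonov_gradient(grad_f, x0=[5.0, -5.0]))
```

On this quadratic the iterates approach the unique minimizer \(A^{-1}b = (0.4, -0.2)\). The decaying Tikhonov term \(c/t^2\) biases early iterates toward the origin and then vanishes, which is the mechanism behind selecting the minimum-norm minimizer when the solution set is not a singleton.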
About the journal:
The Journal of Optimization Theory and Applications is devoted to the publication of carefully selected regular papers, invited papers, survey papers, technical notes, book notices, and forums that cover mathematical optimization techniques and their applications to science and engineering. Typical theoretical areas include linear, nonlinear, mathematical, and dynamic programming. Among the areas of application covered are mathematical economics, mathematical physics and biology, and aerospace, chemical, civil, electrical, and mechanical engineering.