Newton and interior-point methods for (constrained) nonconvex–nonconcave minmax optimization with stability and instability guarantees
Raphael Chinchilla, Guosong Yang, João P. Hespanha
Mathematics of Control, Signals, and Systems (published 2023-10-10). DOI: 10.1007/s00498-023-00371-4
Citations: 0
Abstract
We address the problem of finding a local solution to a nonconvex–nonconcave minmax optimization problem using Newton-type methods, including primal-dual interior-point ones. The first step in our approach is to analyze the local convergence properties of Newton’s method in nonconvex minimization. It is well established that Newton’s method iterations are attracted to any point with a zero gradient, irrespective of whether it is a local minimum. From a dynamical-systems standpoint, this occurs because every point at which the gradient is zero is a locally asymptotically stable equilibrium point. We show that, by adding a multiple of the identity so that the Hessian matrix is always positive definite, we can ensure that every non-local-minimum equilibrium point becomes unstable (meaning that the iterations are no longer attracted to such points), while local minima remain locally asymptotically stable. Building on this foundation, we develop Newton-type algorithms for minmax optimization, conceptualized as a sequence of local quadratic approximations to the minmax problem. The local quadratic approximation serves as a surrogate that guides the modified Newton’s method toward a solution. For these local quadratic approximations to be well defined, it is necessary to modify the Hessian matrix by adding a diagonal matrix. We demonstrate that, for an appropriate choice of this diagonal matrix, we can guarantee the instability of every non-local-minmax equilibrium point while maintaining stability for local minmax points. Using numerical examples, we illustrate the importance of guaranteeing the instability property. While our results concern local convergence, the numerical examples also indicate that our algorithm enjoys good global convergence properties.
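To make the mechanism concrete, the minimal sketch below (Python/NumPy, not taken from the paper) illustrates the two modifications the abstract describes: an identity shift that makes the Hessian positive definite in the minimization case, and a block-diagonal modification D = blkdiag(eps_x·I, −eps_y·I) that makes the local quadratic approximation a well-posed minmax problem. The function names, the eigenvalue-based shift rule, and this particular choice of D are illustrative assumptions; the paper derives its own conditions on the modification (and also treats primal-dual interior-point variants for constrained problems), which this sketch does not reproduce.

```python
import numpy as np


def damped_newton_step(grad, hess, x, delta=1e-6):
    """One modified Newton step for minimizing f.

    A multiple of the identity is added to the Hessian so that the
    modified matrix is positive definite (smallest eigenvalue >= delta).
    With this modification, stationary points that are not local minima
    repel the iteration, while local minima remain attractive.
    """
    g = grad(x)
    H = hess(x)
    eps = max(0.0, delta - np.linalg.eigvalsh(H).min())  # eigenvalue shift
    H_mod = H + eps * np.eye(len(x))
    return x - np.linalg.solve(H_mod, g)


def minmax_newton_step(grad_x, grad_y, hess, x, y, delta=1e-6):
    """One Newton-type step for min_x max_y f(x, y).

    The full Hessian is modified by a diagonal matrix
    D = blkdiag(eps_x * I, -eps_y * I) chosen so that the (x, x) block is
    positive definite and the (y, y) block is negative definite, making
    the local quadratic approximation a well-posed minmax problem.
    (Illustrative choice of D; the paper derives its own conditions.)
    """
    nx, ny = len(x), len(y)
    g = np.concatenate([grad_x(x, y), grad_y(x, y)])
    H = hess(x, y)                      # full (nx+ny) x (nx+ny) Hessian of f
    Hxx, Hyy = H[:nx, :nx], H[nx:, nx:]
    eps_x = max(0.0, delta - np.linalg.eigvalsh(Hxx).min())
    eps_y = max(0.0, delta + np.linalg.eigvalsh(Hyy).max())
    D = np.diag(np.concatenate([eps_x * np.ones(nx), -eps_y * np.ones(ny)]))
    step = np.linalg.solve(H + D, g)    # Newton step toward the saddle of the model
    return x - step[:nx], y - step[nx:]


if __name__ == "__main__":
    # Toy saddle: f(x, y) = 0.5*x^2 - 0.5*y^2 + x*y, local minmax point at the origin.
    grad_x = lambda x, y: x + y
    grad_y = lambda x, y: x - y
    hess = lambda x, y: np.array([[1.0, 1.0], [1.0, -1.0]])
    x, y = np.array([1.0]), np.array([1.0])
    x, y = minmax_newton_step(grad_x, grad_y, hess, x, y)
    print(x, y)  # reaches the saddle point (0, 0) in one step
```

On this toy quadratic saddle the unmodified blocks already satisfy the definiteness conditions, so the modified step coincides with a plain Newton step; the modification only becomes active, and changes the stability of equilibria, when the relevant Hessian blocks are indefinite or have the wrong sign.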
Journal description:
Mathematics of Control, Signals, and Systems (MCSS) is an international journal devoted to mathematical control and system theory, including system theoretic aspects of signal processing.
Its unique feature is its focus on mathematical system theory; it concentrates on the mathematical theory of systems with inputs and/or outputs and dynamics that are typically described by deterministic or stochastic ordinary or partial differential equations, differential algebraic equations or difference equations.
Potential topics include, but are not limited to, controllability, observability, and realization theory; stability theory of nonlinear systems; system identification; mathematical aspects of switched, hybrid, networked, and stochastic systems; and system theoretic aspects of optimal control and other controller design techniques. Application-oriented papers are welcome if they contain a significant theoretical contribution.