{"title":"Multistart algorithm for identifying all optima of nonconvex stochastic functions","authors":"Prateek Jaiswal, Jeffrey Larson","doi":"10.1007/s11590-024-02114-z","DOIUrl":null,"url":null,"abstract":"<p>We propose a multistart algorithm to identify all local minima of a constrained, nonconvex stochastic optimization problem. The algorithm uniformly samples points in the domain and then starts a local stochastic optimization run from any point that is the “probabilistically best” point in its neighborhood. Under certain conditions, our algorithm is shown to asymptotically identify all local optima with high probability; this holds even though our algorithm is shown to almost surely start only finitely many local stochastic optimization runs. We demonstrate the performance of an implementation of our algorithm on nonconvex stochastic optimization problems, including identifying optimal variational parameters for the quantum approximate optimization algorithm.</p>","PeriodicalId":49720,"journal":{"name":"Optimization Letters","volume":"8 1","pages":""},"PeriodicalIF":1.3000,"publicationDate":"2024-05-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Optimization Letters","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s11590-024-02114-z","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0
Abstract
We propose a multistart algorithm to identify all local minima of a constrained, nonconvex stochastic optimization problem. The algorithm uniformly samples points in the domain and then starts a local stochastic optimization run from any point that is the “probabilistically best” point in its neighborhood. Under certain conditions, our algorithm is shown to asymptotically identify all local optima with high probability; this holds even though our algorithm is shown to almost surely start only finitely many local stochastic optimization runs. We demonstrate the performance of an implementation of our algorithm on nonconvex stochastic optimization problems, including identifying optimal variational parameters for the quantum approximate optimization algorithm.
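The sketch below illustrates the general multistart pattern the abstract describes: uniformly sample the domain, keep only those sample points that appear "probabilistically best" within a neighborhood under repeated noisy evaluations, and launch a local run from each such point. It is only a minimal illustration under assumed choices (a hypothetical noisy test objective, a z-test comparison rule, a fixed neighborhood radius, and a Nelder–Mead local solver standing in for a true local stochastic optimizer); it is not the authors' algorithm or its conditions for asymptotic guarantees.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(0)


def noisy_objective(x, sigma=0.1):
    """Illustrative nonconvex objective observed with additive Gaussian noise (assumption)."""
    return np.sum(x**2) + np.sin(5 * np.sum(x)) + rng.normal(0.0, sigma)


def is_probabilistically_best(i, points, means, sems, radius, alpha=0.05):
    """True if no sampled neighbor within `radius` is significantly better than
    point i under a simple two-sample z-test (a stand-in comparison rule)."""
    for j in range(len(points)):
        if j == i:
            continue
        if np.linalg.norm(points[j] - points[i]) <= radius:
            z = (means[i] - means[j]) / np.sqrt(sems[i] ** 2 + sems[j] ** 2)
            if stats.norm.cdf(z) > 1 - alpha:  # neighbor j is significantly better
                return False
    return True


def multistart_sketch(dim=2, n_samples=200, n_reps=20, radius=0.3, bounds=(-2.0, 2.0)):
    lo, hi = bounds
    # Uniformly sample candidate start points in the (box-constrained) domain.
    points = rng.uniform(lo, hi, size=(n_samples, dim))
    # Estimate each point's value from repeated noisy evaluations.
    evals = np.array([[noisy_objective(p) for _ in range(n_reps)] for p in points])
    means = evals.mean(axis=1)
    sems = evals.std(axis=1, ddof=1) / np.sqrt(n_reps)

    minima = []
    for i, p in enumerate(points):
        if is_probabilistically_best(i, points, means, sems, radius):
            # Launch a local run; sample-averaging inside the objective is a
            # crude stand-in for a genuine local stochastic optimization solver.
            res = minimize(
                lambda x: np.mean([noisy_objective(x) for _ in range(n_reps)]),
                p,
                method="Nelder-Mead",
            )
            minima.append(res.x)
    return minima


if __name__ == "__main__":
    print(multistart_sketch())
```

The key design point mirrored here is that a local run is started only from sample points not dominated (with statistical significance) by any neighbor, which is what keeps the number of local runs small even as the sample size grows.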
About the journal:
Optimization Letters is an international journal covering all aspects of optimization, including theory, algorithms, computational studies, and applications, and providing an outlet for rapid publication of short communications in the field. Originality, significance, quality and clarity are the essential criteria for choosing the material to be published.
Optimization has been expanding in all directions at an astonishing rate during the last few decades. New algorithmic and theoretical techniques have been developed, the diffusion into other disciplines has proceeded at a rapid pace, and our knowledge of all aspects of the field has grown even more profound. At the same time, one of the most striking trends in optimization is the constantly increasing interdisciplinary nature of the field.
Optimization Letters aims to communicate recent developments in optimization in a timely fashion through concise articles (limited to a total of ten journal pages). Such concise articles are easily accessible to readers working in any aspect of optimization who wish to stay informed of recent developments.