Swarm-Based Gradient Descent Meets Simulated Annealing

Authors: Zhiyan Ding, Martin Guerra, Qin Li, Eitan Tadmor
Journal: SIAM Journal on Numerical Analysis, Volume 62, Issue 6, pp. 2745-2781, December 2024
DOI: https://doi.org/10.1137/24m1657808
Publication type: Journal Article
Journal ranking: JCR Q1, Mathematics, Applied; Impact Factor 2.8
Open access: no
Citations: 0
Abstract
We introduce a novel method, called swarm-based simulated annealing (SSA), for nonconvex optimization, which sits at the interface between swarm-based gradient descent (SBGD) [J. Lu et al., arXiv:2211.17157; E. Tadmor and A. Zenginoglu, Acta Appl. Math., 190 (2024)] and simulated annealing (SA) [V. Cerny, J. Optim. Theory Appl., 45 (1985), pp. 41–51; S. Kirkpatrick et al., Science, 220 (1983), pp. 671–680; S. Geman and C.-R. Hwang, SIAM J. Control Optim., 24 (1986), pp. 1031–1043]. As in SBGD, we introduce a swarm of agents, each identified with a position x and a mass m, to explore the ambient space. As in SA, the agents proceed in the gradient descent direction and are subject to Brownian motion. The annealing rate, however, is dictated by a decreasing function of their mass. As a consequence, instead of the SA protocol of a time-decreasing temperature, here the swarm decides how to "cool down" agents, depending on their own accumulated mass. The mass dynamics is coupled with the position dynamics: agents at higher ground transfer (part of) their mass to those at lower ground. Consequently, the resulting SSA optimizer is dynamically divided between heavier, cooler agents viewed as "leaders" and lighter, warmer agents viewed as "explorers." Mean-field convergence analysis and benchmark optimizations demonstrate the effectiveness of the SSA method as a multidimensional global optimizer.
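The dynamics described in the abstract — a gradient-descent drift plus Brownian noise whose temperature decreases with an agent's mass, coupled with mass transfer from higher-ground agents to lower-ground ones — can be sketched as follows. This is a minimal illustrative sketch, not the paper's scheme: the temperature rule `T_i = beta * (1 - m_i)` and the mass-transfer rule (shedding mass toward the current best agent) are assumptions chosen only to mimic the leaders/explorers behavior the abstract describes.

```python
import numpy as np

def ssa_step(X, m, grad_F, F, h=0.01, beta=1.0, rng=None):
    """One hypothetical SSA step (illustrative, not the paper's scheme).

    X : (N, d) array of agent positions.
    m : (N,) array of agent masses, summing to 1.
    """
    rng = np.random.default_rng() if rng is None else rng
    N, d = X.shape

    # Position update: gradient descent plus Brownian motion whose
    # temperature is a decreasing function of the agent's mass, so
    # heavy agents ("leaders") are cool and light ones ("explorers")
    # stay noisy.  Assumed schedule: T_i = beta * (1 - m_i).
    T = beta * (1.0 - m)
    noise = rng.standard_normal((N, d))
    X = X - h * grad_F(X) + np.sqrt(2.0 * h * T)[:, None] * noise

    # Mass update: agents at higher ground shed part of their mass,
    # which is transferred to the agent at the lowest ground
    # (a simple stand-in for the paper's position-mass coupling).
    vals = F(X)
    best = int(np.argmin(vals))
    spread = vals.max() - vals.min() + 1e-12
    shed = h * m * (vals - vals.min()) / spread  # more shedding uphill
    m = m - shed
    m[best] += shed.sum()                        # total mass is conserved
    return X, m
```

A usage sketch on the quadratic F(x) = ||x||^2: initialize N agents uniformly in a box with equal masses 1/N, then iterate `ssa_step`; over the iterations one agent accumulates mass, cools down, and settles near the minimizer while the light agents keep exploring.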
Journal description:
SIAM Journal on Numerical Analysis (SINUM) contains research articles on the development and analysis of numerical methods. Topics include the rigorous study of convergence of algorithms, their accuracy, their stability, and their computational complexity. Also included are results in mathematical analysis that contribute to algorithm analysis, and computational results that demonstrate algorithm behavior and applicability.