D. Pozharskiy, Noah J. Wichrowski, A. Duncan, G. Pavliotis, I. Kevrekidis
{"title":"Manifold learning for accelerating coarse-grained optimization","authors":"D. Pozharskiy, Noah J. Wichrowski, A. Duncan, G. Pavliotis, I. Kevrekidis","doi":"10.3934/jcd.2020021","DOIUrl":null,"url":null,"abstract":"Algorithms proposed for solving high-dimensional optimization problems with no derivative information frequently encounter the \"curse of dimensionality,\" becoming ineffective as the dimension of the parameter space grows. One feature of a subclass of such problems that are effectively low-dimensional is that only a few parameters (or combinations thereof) are important for the optimization and must be explored in detail. Knowing these parameters/ combinations in advance would greatly simplify the problem and its solution. We propose the data-driven construction of an effective (coarse-grained, \"trend\") optimizer, based on data obtained from ensembles of brief simulation bursts with an \"inner\" optimization algorithm, that has the potential to accelerate the exploration of the parameter space. The trajectories of this \"effective optimizer\" quickly become attracted onto a slow manifold parameterized by the few relevant parameter combinations. We obtain the parameterization of this low-dimensional, effective optimization manifold on the fly using data mining/manifold learning techniques on the results of simulation (inner optimizer iteration) burst ensembles and exploit it locally to \"jump\" forward along this manifold. As a result, we can bias the exploration of the parameter space towards the few, important directions and, through this \"wrapper algorithm,\" speed up the convergence of traditional optimization algorithms.","PeriodicalId":37526,"journal":{"name":"Journal of Computational Dynamics","volume":"32 1","pages":""},"PeriodicalIF":1.0000,"publicationDate":"2020-01-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Computational Dynamics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3934/jcd.2020021","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"Engineering","Score":null,"Total":0}
Manifold learning for accelerating coarse-grained optimization
Algorithms proposed for solving high-dimensional optimization problems without derivative information frequently encounter the "curse of dimensionality," becoming ineffective as the dimension of the parameter space grows. One feature of a subclass of such problems that are effectively low-dimensional is that only a few parameters (or combinations thereof) are important for the optimization and must be explored in detail. Knowing these parameters/combinations in advance would greatly simplify the problem and its solution. We propose the data-driven construction of an effective (coarse-grained, "trend") optimizer, based on data obtained from ensembles of brief simulation bursts with an "inner" optimization algorithm, that has the potential to accelerate the exploration of the parameter space. The trajectories of this "effective optimizer" quickly become attracted onto a slow manifold parameterized by the few relevant parameter combinations. We obtain the parameterization of this low-dimensional, effective optimization manifold on the fly, using data-mining/manifold-learning techniques on the results of simulation (inner-optimizer iteration) burst ensembles, and exploit it locally to "jump" forward along the manifold. As a result, we can bias the exploration of the parameter space toward the few important directions and, through this "wrapper algorithm," speed up the convergence of traditional optimization algorithms.
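To make the wrapper idea concrete, here is a minimal sketch of the burst-ensemble/jump loop. It is an illustration under stated assumptions, not the paper's actual implementation: it uses plain gradient descent as the inner optimizer and PCA (via SVD) on burst endpoints as a simple linear stand-in for the manifold-learning step, and all function names and parameter values (`inner_burst`, `wrapper_step`, `n_replicas`, `jump`, the toy stiff quadratic) are hypothetical choices for the sketch.

```python
import numpy as np

def inner_burst(x0, grad, n_steps=20, lr=1e-3):
    """A brief burst of the 'inner' optimizer (plain gradient descent
    here, as an assumption -- any local optimizer could serve)."""
    x = x0.copy()
    for _ in range(n_steps):
        x = x - lr * grad(x)
    return x

def wrapper_step(x, grad, rng, n_replicas=10, noise=1e-2, jump=5.0):
    """One coarse ('outer') step: run an ensemble of perturbed bursts,
    learn the locally dominant direction of the slow manifold from the
    burst endpoints, and extrapolate ('jump') along it.  PCA via SVD
    stands in for the paper's manifold-learning step."""
    # Ensemble of brief bursts started from perturbed copies of x.
    ends = np.array([inner_burst(x + noise * rng.standard_normal(x.size), grad)
                     for _ in range(n_replicas)])
    center = ends.mean(axis=0)
    # The leading principal direction of the endpoint cloud approximates
    # the local tangent of the slow, effective optimization manifold.
    _, _, vt = np.linalg.svd(ends - center, full_matrices=False)
    direction = vt[0]
    # Jump forward along the manifold by a multiple of the ensemble's
    # mean drift projected onto that direction.
    slow_drift = direction @ (center - x)
    return center + jump * slow_drift * direction

# Toy problem: a stiff quadratic whose fast directions collapse quickly
# under the inner optimizer, leaving one slow coordinate to explore.
A = np.diag([1e3, 1e3, 1e3, 1.0])
grad = lambda x: A @ x
rng = np.random.default_rng(42)
x = np.ones(4)
for _ in range(30):
    x = wrapper_step(x, grad, rng)
print(x)  # the slow coordinate x[3] is driven toward the optimum at 0
```

The extrapolation step plays the role of the abstract's "jump" along the learned manifold, in the spirit of coarse projective integration: the inner bursts relax the fast directions, and the outer step takes a larger stride along the slow one than the inner optimizer would on its own.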
About the journal:
JCD is focused on the intersection of computation with deterministic and stochastic dynamics. The mission of the journal is to publish papers that explore new computational methods for analyzing dynamic problems or use novel dynamical methods to improve computation. The subject matter of JCD includes both fundamental mathematical contributions and applications to problems from science and engineering. A non-exhaustive list of topics includes:
* Computation of phase-space structures and bifurcations
* Multi-time-scale methods
* Structure-preserving integration
* Nonlinear and stochastic model reduction
* Set-valued numerical techniques
* Network and distributed dynamics

JCD includes both original research and survey papers that give a detailed and illuminating treatment of an important area of current interest. The editorial board of JCD consists of world-leading researchers from mathematics, engineering, and science, all of whom are experts in both computational methods and the theory of dynamical systems.