Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII: Latest Publications

Parallel Evolutionary Algorithms Performing Pairwise Comparisons
M. Cauwet, O. Teytaud, Shih-Yuan Chiu, Kuo-Min Lin, Shi-Jim Yen, D. St-Pierre, F. Teytaud
{"title":"Parallel Evolutionary Algorithms Performing Pairwise Comparisons","authors":"M. Cauwet, O. Teytaud, Shih-Yuan Chiu, Kuo-Min Lin, Shi-Jim Yen, D. St-Pierre, F. Teytaud","doi":"10.1145/2725494.2725499","DOIUrl":"https://doi.org/10.1145/2725494.2725499","url":null,"abstract":"We study mathematically and experimentally the convergence rate of differential evolution and particle swarm optimization for simple unimodal functions. Due to parallelization concerns, the focus is on lower bounds on the runtime, i.e. upper bounds on the speed-up, as a function of the population size. Two cases are particularly relevant: A population size of the same order of magnitude as the dimension and larger population sizes. We use the branching factor as a tool for proving bounds and get, as upper bounds, a linear speed-up for a population size similar to the dimension, and a logarithmic speed-up for larger population sizes. We then propose parametrizations for differential evolution and particle swarm optimization that reach these bounds.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129057069","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Understanding Simple Asynchronous Evolutionary Algorithms
Eric O. Scott, K. D. Jong
{"title":"Understanding Simple Asynchronous Evolutionary Algorithms","authors":"Eric O. Scott, K. D. Jong","doi":"10.1145/2725494.2725509","DOIUrl":"https://doi.org/10.1145/2725494.2725509","url":null,"abstract":"In many applications of evolutionary algorithms, the time required to evaluate the fitness of individuals is long and variable. When the variance in individual evaluation times is non-negligible, traditional, synchronous master-slave EAs incur idle time in CPU resources. An asynchronous approach to parallelization of EAs promises to eliminate idle time and thereby to reduce the amount of wall-clock time it takes to solve a problem. However, the behavior of asynchronous evolutionary algorithms is not well understood. In particular, it is not clear exactly how much faster the asynchronous algorithm will tend to run, or whether its evolutionary trajectory may follow a sub-optimal search path that cancels out the promised benefits. This paper presents a preliminary analysis of simple asynchronous EA performance in terms of speed and problem-solving ability.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"72 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121662107","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 36
Hypomixability Elimination In Evolutionary Systems
Keki M. Burjorjee
{"title":"Hypomixability Elimination In Evolutionary Systems","authors":"Keki M. Burjorjee","doi":"10.1145/2725494.2725511","DOIUrl":"https://doi.org/10.1145/2725494.2725511","url":null,"abstract":"Hypomixability Elimination is an intriguing form of computation thought to underlie general-purpose, non-local, noise-tolerant adaptation in recombinative evolutionary systems. We demonstrate that hypomixability elimination in recombinative evolutionary systems can be efficient by using it to obtain optimal bounds on the time and queries required to solve a subclass (k=7, η=1/5) of a familiar computational learning problem: PAC-learning parities with noisy membership queries; where k is the number of relevant attributes and η is the oracle's noise rate. Specifically, we show that a simple genetic algorithm with uniform crossover (free recombination) that treats the noisy membership query oracle as a fitness function can be rigged to PAC-learn the relevant variables in O(log (n/δ)) queries and O(n log (n/δ)) time, where n is the total number of attributes and δ is the probability of error. To the best of our knowledge, this is the first time optimally efficient computation has been shown to occur in, an evolutionary algorithm, on a non-trivial problem. The optimality result and indeed the implicit implementation of hypomixability elimination by a simple genetic algorithm depends crucially on recombination. This dependence yields a fresh, unified explanation for sex, adaptation, speciation, and the emergence of modularity in evolutionary systems. Compared to other explanations, Hypomixability Theory is exceedingly parsimonious. For example, it does not assume deleterious mutation, a changing fitness landscape, or the existence of building blocks.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115870706","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
Self-Adapting the Brownian Radius in a Differential Evolution Algorithm for Dynamic Environments
M. D. Plessis, A. Engelbrecht, A. Calitz
{"title":"Self-Adapting the Brownian Radius in a Differential Evolution Algorithm for Dynamic Environments","authors":"M. D. Plessis, A. Engelbrecht, A. Calitz","doi":"10.1145/2725494.2725505","DOIUrl":"https://doi.org/10.1145/2725494.2725505","url":null,"abstract":"Several algorithms aimed at dynamic optimisation problems have been developed. This paper reports on the incorporation of a self-adaptive Brownian radius into competitive differential evolution (CDE). Four variations of a novel technique to achieving the self-adaptation is suggested and motivated. An experimental investigation over a large number of benchmark instances is used to determine the most effective of the four variations. The new algorithm is compared to its base algorithm on an extensive set of benchmark problems and its performance analysed. Finally, the new algorithm is compared to other algorithms by means of reported results found in the literature. The results indicate that CDE is improved the the incorporation of the self-adaptive Brownian radius and that the new algorithm compares well with other algorithms.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"59 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131729326","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
Information Geometry of the Gaussian Distribution in View of Stochastic Optimization
Luigi Malagò, Giovanni Pistone
{"title":"Information Geometry of the Gaussian Distribution in View of Stochastic Optimization","authors":"Luigi Malagò, Giovanni Pistone","doi":"10.1145/2725494.2725510","DOIUrl":"https://doi.org/10.1145/2725494.2725510","url":null,"abstract":"We study the optimization of a continuous function by its stochastic relaxation, i.e., the optimization of the expected value of the function itself with respect to a density in a statistical model. We focus on gradient descent techniques applied to models from the exponential family and in particular on the multivariate Gaussian distribution. From the theory of the exponential family, we reparametrize the Gaussian distribution using natural and expectation parameters, and we derive formulas for natural gradients in both parameterizations. We discuss some advantages of the natural parameterization for the identification of sub-models in the Gaussian distribution based on conditional independence assumptions among variables. Gaussian distributions are widely used in stochastic optimization and in particular in model-based Evolutionary Computation, as in Estimation of Distribution Algorithms and Evolutionary Strategies. By studying natural gradient flows over Gaussian distributions our analysis and results directly apply to the study of CMA-ES and NES algorithms.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117332590","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 50
Evolution Strategies with Additive Noise: A Convergence Rate Lower Bound
S. Morales, M. Cauwet, O. Teytaud
{"title":"Evolution Strategies with Additive Noise: A Convergence Rate Lower Bound","authors":"S. Morales, M. Cauwet, O. Teytaud","doi":"10.1145/2725494.2725500","DOIUrl":"https://doi.org/10.1145/2725494.2725500","url":null,"abstract":"We consider the problem of optimizing functions corrupted with additive noise. It is known that Evolutionary Algorithms can reach a Simple Regret O(1/√n) within logarithmic factors, when n is the number of function evaluations. Here, Simple Regret at evaluation $n$ is the difference between the evaluation of the function at the current recommendation point of the algorithm and at the real optimum. We show mathematically that this bound is tight, for any family of functions that includes sphere functions, at least for a wide set of Evolution Strategies without large mutations.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"223 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122900290","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 21
Run-Time Analysis of Population-Based Evolutionary Algorithm in Noisy Environments
A. Prügel-Bennett, J. Rowe, J. Shapiro
{"title":"Run-Time Analysis of Population-Based Evolutionary Algorithm in Noisy Environments","authors":"A. Prügel-Bennett, J. Rowe, J. Shapiro","doi":"10.1145/2725494.2725498","DOIUrl":"https://doi.org/10.1145/2725494.2725498","url":null,"abstract":"This paper analyses a generational evolutionary algorithm using only selection and uniform crossover. With a probability arbitrarily close to one the evolutionary algorithm is shown to solve onemax in O(n log2(n)) function evaluations using a population of size c,n, log(n). We then show that this algorithm can solve onemax with noise variance n again in O(n log2(n)) function evaluations.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125529801","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 31
A More Efficient Rank-one Covariance Matrix Update for Evolution Strategies
Oswin Krause, C. Igel
{"title":"A More Efficient Rank-one Covariance Matrix Update for Evolution Strategies","authors":"Oswin Krause, C. Igel","doi":"10.1145/2725494.2725496","DOIUrl":"https://doi.org/10.1145/2725494.2725496","url":null,"abstract":"Learning covariance matrices of Gaussian distributions is at the heart of most variable-metric randomized algorithms for continuous optimization. If the search space dimensionality is high, updating the covariance or its factorization is computationally expensive. Therefore, we adopt an algorithm from numerical mathematics for rank-one updates of Cholesky factors. Our methods results in a quadratic time covariance matrix update scheme with minimal memory requirements. The numerically stable algorithm leads to triangular Cholesky factors. Systems of linear equations where the linear transformation is defined by a triangular matrix can be solved in quadratic time. This can be exploited to avoid the additional iterative update of the inverse Cholesky factor required in some covariance matrix adaptation algorithms proposed in the literature. When used together with the (1+1)-CMA-ES and the multi-objective CMA-ES, the new method leads to a memory reduction by a factor of almost four and a faster covariance matrix update. The numerical stability and runtime improvements are demonstrated on a set of benchmark functions.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"2015 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134297432","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 25
Efficient Optimisation of Noisy Fitness Functions with Population-based Evolutionary Algorithms
D. Dang, P. Lehre
{"title":"Efficient Optimisation of Noisy Fitness Functions with Population-based Evolutionary Algorithms","authors":"D. Dang, P. Lehre","doi":"10.1145/2725494.2725508","DOIUrl":"https://doi.org/10.1145/2725494.2725508","url":null,"abstract":"Population-based EAs can optimise pseudo-Boolean functions in expected polynomial time, even when only partial information about the problem is available [7]. In this paper, we show that the approach used to analyse optimisation with partial information extends naturally to optimisation under noise. We consider pseudo-Boolean problems with an additive noise term. Very general conditions on the noise term is derived, under which the EA optimises the noisy function in expected polynomial time. In the case of the Onemax and Leadingones problems, efficient optimisation is even possible when the variance of the noise distribution grows quickly with the problem size.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128118257","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 39
Evolutionary Dynamics on Graphs: Invited Talk
L. A. Goldberg
{"title":"Evolutionary Dynamics on Graphs: Invited Talk","authors":"L. A. Goldberg","doi":"10.1145/2725494.2725495","DOIUrl":"https://doi.org/10.1145/2725494.2725495","url":null,"abstract":"The Moran process [5], as adapted by Lieberman, Hauert and Nowak [4], is a discrete-time random process which models the spread of genetic mutations through populations. Individuals are modelled as the vertices of a graph. Each vertex is either infected or uninfected. The model has a parameter r > 0. Infected vertices have fitness r and uninfected vertices have fitness 1. At each step, an individual is selected to reproduce with probability proportional to its fitness. This vertex chooses one of its neighbours uniformly at random and updates the state of that neighbour (infected or not) to match its own. In the initial state, one vertex is chosen uniformly at random to be infected and the other vertices are uninfected. If the graph is strongly connected then the process will terminate with probability 1, either in the state where every vertex is infected (known as fixation) or in the state where no vertex is infected (known as extinction). The principal quantities of interest are the fixation probability (the probability of reaching fixation) and the expected absorption time (the expected number of steps before fixation or extinction is reached). In general, these depend on both the graph topology and the parameter r. We study three questions.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"139 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121590058","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3