Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII

Convergence of Strategies in Simple Co-Adapting Games
T. Jansen, G. Ochoa, C. Zarges
{"title":"Convergence of Strategies in Simple Co-Adapting Games","authors":"T. Jansen, G. Ochoa, C. Zarges","doi":"10.1145/2725494.2725503","DOIUrl":"https://doi.org/10.1145/2725494.2725503","url":null,"abstract":"Simultaneously co-adapting agents in an uncooperative setting can result in a non-stationary environment where optimisation or learning is difficult and where the agents' strategies may not converge to solutions. This work looks at simple simultaneous-move games with two or three actions and two or three players. Fictitious play is an old but popular algorithm that can converge to solutions, albeit slowly, in self-play in games like these. It models its opponents assuming that they use stationary strategies and plays a best-response strategy to these models. We propose two new variants of fictitious play that remove this assumption and explicitly assume that the opponents use dynamic strategies. The opponent's strategy is predicted using a sequence prediction method in the first variant and a change detection method in the second variant. Empirical results show that our variants converge faster than fictitious play. However, they do not always converge exactly to correct solutions. For change detection, this is a very small number of cases, but for sequence prediction there are many. The convergence of sequence prediction is improved by combining it with fictitious play. Also, unlike in fictitious play, our variants converge to solutions in the difficult Shapley's and Jordan's games.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126953552","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
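
As a reference point for the baseline the paper builds on, here is a minimal sketch of classical fictitious play in self-play on a two-player bimatrix game. The matching-pennies example, function names and parameters are illustrative; the paper's sequence-prediction and change-detection variants are not reproduced here.

```python
import numpy as np

def fictitious_play(payoff_a, payoff_b, rounds=1000):
    """Classical fictitious play in self-play on a bimatrix game.

    Each player keeps empirical counts of the opponent's past actions
    (implicitly modelling the opponent as stationary) and plays a best
    response to the resulting empirical mixed strategy.
    """
    n_a, n_b = payoff_a.shape
    counts_of_a = np.ones(n_a)  # B's model: how often A played each action
    counts_of_b = np.ones(n_b)  # A's model: how often B played each action
    for _ in range(rounds):
        act_a = int(np.argmax(payoff_a @ (counts_of_b / counts_of_b.sum())))
        act_b = int(np.argmax((counts_of_a / counts_of_a.sum()) @ payoff_b))
        counts_of_a[act_a] += 1
        counts_of_b[act_b] += 1
    # Empirical frequencies; for zero-sum games these converge, slowly.
    return counts_of_a / counts_of_a.sum(), counts_of_b / counts_of_b.sum()

# Matching pennies: empirical strategies drift towards (1/2, 1/2).
A = np.array([[1.0, -1.0], [-1.0, 1.0]])
print(fictitious_play(A, -A))
```
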
On the Black-Box Complexity of Example Functions: The Real Jump Function
T. Jansen
{"title":"On the Black-Box Complexity of Example Functions: The Real Jump Function","authors":"T. Jansen","doi":"10.1145/2725494.2725507","DOIUrl":"https://doi.org/10.1145/2725494.2725507","url":null,"abstract":"Black-box complexity measures the difficulty of classes of functions with respect to optimisation by black-box algorithms. Comparing the black-box complexity with the worst case performance of a best know randomised search heuristic can help to assess if the randomised search heuristic is efficient or if there is room for improvement. When considering an example function it is necessary to extend it to a class of functions since single functions always have black-box complexity 1. Different kinds of extensions of single functions to function classes have been considered. In cases where the gap between the performance of the best randomised search heuristic and the black-box complexity is still large it can help to consider more restricted black-box complexity notions like unbiased black-box complexity. For the well-known Jump function neither considering different extensions nor considering more restricted notions of black-box complexity have been successful so far. We argue that the problem is not with the notion of black-box complexity but with the extension to a function class. We propose a different extension and show that for this extension there is a much better agreement even between the performance of an extremely simple evolutionary algorithm and the most general notion of black-box complexity.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128159072","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 23
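
For readers unfamiliar with the example function discussed above, here is a sketch of the classical single Jump function in its usual formulation (gap parameter k); the paper's proposed extension of it to a function class is not shown.

```python
def jump(x, k):
    """Classical Jump_k function on a bit string x.

    Fitness grows with the number of ones up to n - k, then drops,
    creating a gap of size k that only the all-ones string bridges.
    """
    n, ones = len(x), sum(x)
    if ones <= n - k or ones == n:
        return k + ones
    return n - ones

# Example: n = 10, k = 3 -- note the fitness drop for 8 and 9 ones.
print([jump([1] * i + [0] * (10 - i), 3) for i in range(11)])
```
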
(1+1) EA on Generalized Dynamic OneMax
Timo Kötzing, Andrei Lissovoi, C. Witt
{"title":"(1+1) EA on Generalized Dynamic OneMax","authors":"Timo Kötzing, Andrei Lissovoi, C. Witt","doi":"10.1145/2725494.2725502","DOIUrl":"https://doi.org/10.1145/2725494.2725502","url":null,"abstract":"Evolutionary algorithms (EAs) perform well in settings involving uncertainty, including settings with stochastic or dynamic fitness functions. In this paper, we analyze the (1+1) EA on dynamically changing OneMax, as introduced by Droste (2003). We re-prove the known results on first hitting times using the modern tool of drift analysis. We extend these results to search spaces which allow for more than two values per dimension. Furthermore, we make an anytime analysis as suggested by Jansen and Zarges (2014), analyzing how closely the (1+1) EA can track the dynamically moving optimum over time. We get tight bounds both for the case of bit strings, as well as for the case of more than two values per position. Surprisingly, in the latter setting, the expected quality of the search point maintained by the (1+1) EA does not depend on the number of values per dimension.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121655244","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 53
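
The following is a minimal sketch of a (1+1) EA tracking a moving OneMax-style optimum. The dynamic model here (one random target bit flips with a fixed probability per generation) and all parameters are illustrative assumptions, simpler than the generalized model analysed in the paper.

```python
import random

def one_plus_one_ea_dynamic(n=50, generations=2000, move_prob=0.1, seed=0):
    """(1+1) EA on a dynamically moving OneMax-style target.

    Fitness counts the positions where the search point agrees with a
    hidden target; with probability move_prob per generation one random
    target bit flips (an illustrative dynamic model).
    """
    rng = random.Random(seed)
    target = [rng.randint(0, 1) for _ in range(n)]
    x = [rng.randint(0, 1) for _ in range(n)]
    fitness = lambda y: sum(a == b for a, b in zip(y, target))
    for _ in range(generations):
        y = [bit ^ (rng.random() < 1.0 / n) for bit in x]  # standard bit mutation
        if fitness(y) >= fitness(x):                       # elitist acceptance
            x = y
        if rng.random() < move_prob:                       # the optimum moves
            target[rng.randrange(n)] ^= 1
    return fitness(x), n  # agreement with the moving target, out of n

print(one_plus_one_ea_dynamic())
```
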
Insights From Adversarial Fitness Functions
Alan J. Lockett
{"title":"Insights From Adversarial Fitness Functions","authors":"Alan J. Lockett","doi":"10.1145/2725494.2725501","DOIUrl":"https://doi.org/10.1145/2725494.2725501","url":null,"abstract":"The performance of optimization is usually studied in specific settings where the fitness functions are highly constrained with static, stochastic or dynamic properties. This work examines what happens when the fitness function is a player engaged with the optimizer in an optimization game. Although the advantage of the fitness function is known through the No Free Lunch theorems, several deep insights about the space of possible performance measurements arise as a consequence of studying these adversarial fitness function, including: 1) Every continuous and linear method of measuring performance can be identified with the optimization game for some adversarial fitness; 2) For any convex continuous performance criterion, there is some deterministic optimizer that performs best, even when the fitness function is stochastic or dynamic; 3) Every stochastic optimization method can be viewed as a probabilistic choice over countably many deterministic methods. All of these statements hold in both finite and infinite search domains.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"155 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115581089","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
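
A toy illustration of statement 3 above: drawing all randomness up front turns a stochastic search method into a probabilistic choice over deterministic methods. This sketch is not from the paper; the method below is deliberately trivial (it ignores fitness feedback) and all names are invented for illustration.

```python
import random

def deterministic_method(seed, domain, budget):
    """With the seed fixed, the whole query sequence is a fixed,
    deterministic object (this toy method ignores fitness feedback)."""
    rng = random.Random(seed)
    return [rng.choice(domain) for _ in range(budget)]

def stochastic_method(domain, budget):
    """Sampling the seed first expresses a stochastic optimizer as a
    probabilistic choice over countably many deterministic methods."""
    seed = random.randrange(2**32)
    return deterministic_method(seed, domain, budget)

print(stochastic_method(domain=[0, 1, 2, 3], budget=5))
```
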
Fixed Budget Performance of the (1+1) EA on Linear Functions
J. Lengler, N. Spooner
{"title":"Fixed Budget Performance of the (1+1) EA on Linear Functions","authors":"J. Lengler, N. Spooner","doi":"10.1145/2725494.2725506","DOIUrl":"https://doi.org/10.1145/2725494.2725506","url":null,"abstract":"We present a fixed budget analysis of the (1+1) evolutionary algorithm for general linear functions, considering both the quality of the solution after a predetermined 'budget' of fitness function evaluations (a priori) and the improvement in quality when the algorithm is given additional budget, given the quality of the current solution (a posteriori). Two methods are presented: one based on drift analysis, the other on the differential equation method and Chebyshev's inequality. While the first method is superior for general linear functions, the second can be more precise for specific functions and provides concentration guarantees. As an example, we provide tight a posteriori fixed budget results for the function OneMax.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"78 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120938268","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 27
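
To make the quantity studied in fixed budget analysis concrete, here is a minimal sketch that runs the (1+1) EA on a linear pseudo-Boolean function for a fixed number of fitness evaluations and reports the quality reached. Function names, weights and budget are illustrative, not from the paper.

```python
import random

def fixed_budget_one_plus_one_ea(weights, budget, seed=0):
    """(1+1) EA on f(x) = sum(w_i * x_i), stopped after `budget`
    fitness evaluations; returns the best fitness found so far."""
    rng = random.Random(seed)
    n = len(weights)
    f = lambda z: sum(w for w, b in zip(weights, z) if b)
    x = [rng.randint(0, 1) for _ in range(n)]
    best = f(x)                       # first evaluation spent on x
    for _ in range(budget - 1):
        y = [b ^ (rng.random() < 1.0 / n) for b in x]  # standard bit mutation
        fy = f(y)
        if fy >= best:                # elitist acceptance
            x, best = y, fy
    return best

# OneMax is the linear function with all weights equal to 1.
print(fixed_budget_one_plus_one_ea([1] * 100, budget=500))
```
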
Partition Crossover for Pseudo-Boolean Optimization
R. Tinós, L. D. Whitley, F. Chicano
{"title":"Partition Crossover for Pseudo-Boolean Optimization","authors":"R. Tinós, L. D. Whitley, F. Chicano","doi":"10.1145/2725494.2725497","DOIUrl":"https://doi.org/10.1145/2725494.2725497","url":null,"abstract":"A partition crossover operator is introduced for use with NK landscapes, MAX-kSAT and for all k-bounded pseudo-Boolean functions. By definition, these problems use a bit representation. Under partition crossover, the evaluation of offspring can be directly obtained from partial evaluations of substrings found in the parents. Partition crossover explores the variable interaction graph of the pseudo-Boolean functions in order to partition the variables of the solution vector. Proofs are presented showing that if the differing variable assignments found in the two parents can be partitioned into q non-interacting sets, partition crossover can be used to find the best of 2q possible offspring. Proofs are presented which show that parents that are locally optimal will always generate offspring that are locally optimal with respect to a (more restricted) hyperplane subspace. Empirical experiments show that parents that are locally optimal generate offspring that are locally optimal in the full search space more than 80 percent of the time. Experimental results also show the effectiveness of the proposed crossover when used in combination with a hybrid genetic algorithm.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"59 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131448776","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 68
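
A simplified sketch of the partition crossover idea, assuming the k-bounded pseudo-Boolean function is given explicitly as a sum of subfunctions with known variable lists (the paper works more generally with the variable interaction graph of NK landscapes, MAX-kSAT and k-bounded functions). Differing variables are grouped into non-interacting components, and each component independently takes the parent assignment with the better partial evaluation, so the best of 2^q recombinations is found with one partial evaluation per component. All names are illustrative.

```python
from collections import defaultdict

def partition_crossover(parent1, parent2, subfunctions):
    """Sketch of partition crossover for f(x) = sum of g(x restricted to vars),
    where subfunctions is a list of (g, vars) pairs (maximisation assumed)."""
    n = len(parent1)
    diff = {i for i in range(n) if parent1[i] != parent2[i]}

    # Union-find over differing variables that co-occur in some subfunction.
    rep = {i: i for i in diff}
    def find(i):
        while rep[i] != i:
            rep[i] = rep[rep[i]]
            i = rep[i]
        return i
    for _, vs in subfunctions:
        d = [v for v in vs if v in diff]
        for a, b in zip(d, d[1:]):
            rep[find(a)] = find(b)

    # Group subfunctions by the component of their differing variables.
    by_comp = defaultdict(list)
    for g, vs in subfunctions:
        d = [v for v in vs if v in diff]
        if d:
            by_comp[find(d[0])].append((g, vs))

    child = list(parent1)
    for comp, subs in by_comp.items():
        def value(parent):
            # Partial evaluation: this component's differing variables take
            # `parent`'s values; all other touched variables are shared.
            return sum(g(tuple(parent[v] if v in diff else parent1[v]
                               for v in vs)) for g, vs in subs)
        if value(parent2) > value(parent1):
            for i in diff:
                if find(i) == comp:
                    child[i] = parent2[i]
    return child

# Example: f(x) = (x0 AND x1) + (x2 OR x3); two non-interacting components.
subs = [(lambda t: t[0] & t[1], [0, 1]), (lambda t: t[0] | t[1], [2, 3])]
print(partition_crossover([1, 0, 0, 0], [0, 1, 1, 1], subs))
```
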
Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII
Jun He, T. Jansen, G. Ochoa, C. Zarges
{"title":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","authors":"Jun He, T. Jansen, G. Ochoa, C. Zarges","doi":"10.1145/2725494","DOIUrl":"https://doi.org/10.1145/2725494","url":null,"abstract":"FOGA, the ACM SIGEVO Workshop on Foundations of Genetic Algorithms, started in 1990 and has, in the past 25 years, established itself as the premier event in the theory of all kinds of randomized search heuristics. Its latest installment, the 13th of its kind, is no exception. \u0000 \u0000FOGA 2015 is special not only because of the quarter of a century anniversary but also because it is the first FOGA to take place in the United Kingdom. Four organizers from all parts of Great Britain joined forces to bring the event to Aberystwyth in Wales. We had 27 participants from seven countries from four continents of the world. They brought with them 16 presentations for accepted papers, carefully selected from 26 submissions. An hour was allocated for each of the presentations to allow ample time to discuss ideas and inspect details. Following the FOGA tradition all papers have undergone another round of reviewing and rewriting after being presented and passionately discussed at the workshop. This ensures that what you find in these postproceedings is the best and best polished current research in the field. \u0000 \u0000The presented papers cover many topics of current research in theory of evolutionary algorithms and other randomized search heuristics. This includes discussion of their limits and potentials, either from the perspective of black-box complexity (Golnaz Badkobeh, Per Kristian Lehre, Dirk Sudholt: Black-box complexity of parallel search with distributed populations; Thomas Jansen: On the black-box complexity of example functions: the real jump function) or from the perspective of adversarial optimization (Alan Lockett: Insights from adversarial fitness functions). A very important aspect of current research are investigations of the performance of specific evolutionary algorithms on specific problems or problem classes. 
Such work includes further investigations of the very well-known and simple (1+1) evolutionary algorithm (Timo Kotzing, Andrei Lissovoi, Carsten Witt: (1+1) EA on generalized dynamic OneMax; Johannes Lengler, Nick Spooner: Fixed budget performance of the (1+1) EA on linear functions), studies of the performance of evolutionary algorithms when confronted with noisy problems (Duc-Cuong Dang, Per-Kristian Lehre: Efficient optimization of noisy fitness functions with population-based evolutionary algorithms; Adam Prugel-Bennett, Jonathan Rowe, Jonathan Shapiro: Run-time analysis of population-based algorithms in noisy environments; Sandra Astete-Morales, Marie-Liesse Cauwet, Olivier Teytaud: Evolution strategies with additive noise: a convergence rate lower bound), studies of parallel evolutionary algorithms (Eric Scott, Kenneth De Jong: Understanding simple asynchronous evolutionary algorithms; Marie-Liesse Cauwet, Shih-Yuan Chiu, Kuo-Min Lin, David Saint-Pierre, Fabien Teytaud, Olivier Teytaud, Shi-Jim Yen: Parallel evolutionary algorithms performing pairwise comparisons) and studies concerned with improving the performance of evolutionary alg","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132056780","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
Black-box Complexity of Parallel Search with Distributed Populations
Golnaz Badkobeh, P. Lehre, Dirk Sudholt
{"title":"Black-box Complexity of Parallel Search with Distributed Populations","authors":"Golnaz Badkobeh, P. Lehre, Dirk Sudholt","doi":"10.1145/2725494.2725504","DOIUrl":"https://doi.org/10.1145/2725494.2725504","url":null,"abstract":"Many metaheuristics such as island models and cellular evolutionary algorithms use a network of distributed populations that communicate search points along a spatial communication topology. The idea is to slow down the spread of information, reducing the risk of \"premature convergence\", and sacrificing exploitation for an increased exploration. We introduce the distributed black-box complexity as the minimum number of function evaluations every distributed black-box algorithm needs to optimise a given problem. It depends on the topology, the number of populations λ, and the problem size n. We give upper and lower bounds on the distributed black-box complexity for unary unbiased black-box algorithms on a class of unimodal functions in order to study the impact of communication topologies on performance. Our results confirm that rings and torus graphs can lead to higher black-box complexities, compared to unrestricted communication. We further determine cut-off points for the number of populations λ, above which no distributed black-box algorithm can achieve linear speedups.","PeriodicalId":112331,"journal":{"name":"Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-01-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116474536","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 28
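
To illustrate the kind of distributed search model whose cost the paper quantifies, here is a minimal island-model sketch: one (1+1) EA per island on a ring topology, with periodic migration to the clockwise neighbour. The problem (OneMax), topology details and all parameters are illustrative assumptions, not the paper's formal model.

```python
import random

def island_model_onemax(n=64, islands=8, migration_interval=10, steps=500, seed=0):
    """One (1+1) EA per island on a ring; every migration_interval
    generations each island receives its predecessor's search point and
    keeps the better one. The restricted topology slows the spread of
    good solutions compared with unrestricted communication."""
    rng = random.Random(seed)
    f = sum  # OneMax
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(islands)]
    for t in range(1, steps + 1):
        for i in range(islands):
            y = [b ^ (rng.random() < 1.0 / n) for b in pop[i]]
            if f(y) >= f(pop[i]):
                pop[i] = y
        if t % migration_interval == 0:
            incoming = [pop[(i - 1) % islands] for i in range(islands)]  # ring
            for i in range(islands):
                if f(incoming[i]) > f(pop[i]):
                    pop[i] = list(incoming[i])
    return max(f(x) for x in pop), n

print(island_model_onemax())
```
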