{"title":"Anaconda defeats Hoyle 6-0: a case study competing an evolved checkers program against commercially available software","authors":"K. Chellapilla, D. Fogel","doi":"10.1109/CEC.2000.870729","DOIUrl":"https://doi.org/10.1109/CEC.2000.870729","url":null,"abstract":"We have been exploring the potential for a coevolutionary process to learn how to play checkers without relying on the usual inclusion of human expertise in the form of features that are believed to be important to playing well. In particular, we have focused on the use of a population of neural networks, where each network serves as an evaluation function to describe the quality of the current board position. After only a little more than 800 generations, the evolutionary process has generated a neural network that can play checkers at the expert level as designated by the US Chess Federation rating system. The current effort reports on a competition between the best-evolved neural network, named \"Anaconda,\" and commercially available software. In a series of six games, Anaconda scored a perfect six wins.","PeriodicalId":218136,"journal":{"name":"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114786144","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Using multiple representations in evolutionary algorithms","authors":"T. Schnier, X. Yao","doi":"10.1109/CEC.2000.870335","DOIUrl":"https://doi.org/10.1109/CEC.2000.870335","url":null,"abstract":"Although evolutionary algorithms are very different from other artificial intelligence search algorithms, they face similar fundamental issues: representation and searching. There has been a large amount of work done in evolutionary computation on searching, such as recombination operators, mutation operators, selection schemes and various specialised operators. In comparison, research on different representations has not been as active. Most such research has been focused on a single representation, e.g. bit strings, real-valued vectors using Cartesian coordinates, etc. This paper proposes and studies multiple representations in an evolutionary algorithm and shows empirically how multiple representations can benefit searches as much as a good search operator could.","PeriodicalId":218136,"journal":{"name":"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122013180","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Expressing evolutionary computation, genetic programming, artificial life, autonomous agents and DNA-based computing in $-calculus-revised version","authors":"E. Eberbach","doi":"10.1109/CEC.2000.870810","DOIUrl":"https://doi.org/10.1109/CEC.2000.870810","url":null,"abstract":"Genetic programming, autonomous agents, artificial life and evolutionary computation share many common ideas. They generally investigate distributed complex processes, perhaps with the ability to interact. It seems natural to study their behavior using process algebras, which were designed to handle distributed interactive systems. $-calculus is a higher-order polyadic process algebra for resource-bounded computation. It has been designed to handle autonomous agents, evolutionary computing, neural nets, expert systems, machine learning, and distributed interactive AI systems in general. $-calculus has a built-in cost-optimization mechanism allowing it to deal with nondeterminism and incomplete and uncertain information. We express in $-calculus several subareas of evolutionary computation, including genetic programming, artificial life, autonomous agents and DNA-based computing.","PeriodicalId":218136,"journal":{"name":"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512)","volume":"45 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132115609","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Solving constraint satisfaction problems with heuristic-based evolutionary algorithms","authors":"B. Craenen, A. Eiben, E. Marchiori","doi":"10.1109/CEC.2000.870843","DOIUrl":"https://doi.org/10.1109/CEC.2000.870843","url":null,"abstract":"Evolutionary algorithms (EAs) for solving constraint satisfaction problems (CSPs) can be roughly divided into two classes: EAs with adaptive fitness functions and heuristic-based EAs. A.E. Eiben et al. (1998) compared effective EAs of the first class experimentally using a large set of benchmark instances consisting of randomly-generated binary CSPs. In this paper, we complete this comparison by performing the same experiments using three of the most effective heuristic-based EAs. The results of our experiments indicate that the three heuristic-based EAs have similar performances on random binary CSPs. Comparing these results with those of A.E. Eiben et al., we are able to identify the best EA for binary CSPs as the algorithm introduced by G. Dozier et al. (1994), which uses a heuristic as well as an adaptive fitness function.","PeriodicalId":218136,"journal":{"name":"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512)","volume":"86 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131586180","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"GA-based kernel optimization for pattern recognition: theory for EHW application","authors":"M. Yasunaga, T. Nakamura, I. Yoshihara, J. Kim","doi":"10.1109/CEC.2000.870344","DOIUrl":"https://doi.org/10.1109/CEC.2000.870344","url":null,"abstract":"An extension of the kernel-based pattern recognition method using a genetic algorithm is proposed. The method is suited to evolvable pattern recognition hardware using FPGAs. In the conventional method one common kernel function is used in the superposition to make discrimination functions. In the extended method each region of the kernel function is optimized individually. For the kernel-region optimization we use a genetic algorithm to solve a large combinatorial problem almost impossible to solve using any brute-force search. A chromosome represents the kernel region in an n-dimensional pattern space, and each locus corresponds to one of the candidates (genes) for an edge length of the kernel region. We have applied the extended method to a sonar spectrum recognition problem and obtained a recognition accuracy of 83.9%, which is much higher than the 62.0% obtained using the conventional kernel-based method and is also better than the 82.7% obtained using the nearest neighbor method and the 83.0% obtained using a neural network (backpropagation algorithm). We have analyzed the individually optimized kernel regions and shown that the GA process automatically extracts features in the patterns and embeds the features in the kernel regions.","PeriodicalId":218136,"journal":{"name":"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512)","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130867438","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Revisiting Bremermann's genetic algorithm. I. Simultaneous mutation of all parameters","authors":"D. Fogel, R. Anderson","doi":"10.1109/CEC.2000.870787","DOIUrl":"https://doi.org/10.1109/CEC.2000.870787","url":null,"abstract":"Hans Bremermann was one of the pioneers of evolutionary computation. Many of his early suggestions for designing evolutionary algorithms anticipated future inventions, including scaling mutations to be inversely proportional to the number of parameters in the problem, as well as many forms of recombination. This paper explores the gain in performance that occurs when Bremermann's original evolutionary algorithm (H.J. Bremermann et al., 1966) is extended to include the simultaneous mutation of every component in a candidate solution. Bremermann's original perspective was entirely \"genetic\", where each component corresponded to a gene, and therefore multiple simultaneous changes were viewed as occurring with geometrically decreasing probability. Experiments indicate that a change in perspective to a \"phenotypic\" view, where all the components change at once, can lead to more rapid optimization on linear systems of equations.","PeriodicalId":218136,"journal":{"name":"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133559367","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Genetic estimation of competitive agents behavior","authors":"A. Florea","doi":"10.1109/CEC.2000.870326","DOIUrl":"https://doi.org/10.1109/CEC.2000.870326","url":null,"abstract":"The paper presents a multi-agent system that tries to solve the problem of rational exploitation of natural renewable resources by self-interested agents. The particular problem instance is that of several fishing companies trying to catch fish in several fishing banks. The system comprises a set of cognitive agents that represent the companies, are able to build plans and will interact with other companies to ensure the ecological balance of the resources and the achievement of their own plans. The environment is represented by a particular agent with which the companies interact. The agents are using both a symbolic representation and a genetic representation to build plans, to model the evolution of the unpredictable world in which they live and to conduct negotiation. The genetic approach is based on cooperative coadapted species and is used to model the multi-agent world from the point of view of a particular agent.","PeriodicalId":218136,"journal":{"name":"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512)","volume":"138 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133672624","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Optimization is easy and learning is hard in the typical function","authors":"T. M. English","doi":"10.1109/CEC.2000.870741","DOIUrl":"https://doi.org/10.1109/CEC.2000.870741","url":null,"abstract":"Elementary results in algorithmic information theory are invoked to show that almost all finite functions are highly random. That is, the shortest program generating a given function description is rarely much shorter than the description. It is also shown that the length of a program for learning or optimization poses a bound on the algorithmic information it supplies about any description. For highly random descriptions, success in guessing values is essentially accidental, but learning accuracy can be high in some cases if the program is long. Optimizers, on the other hand, are graded according to the goodness of values in partial functions they sample. In a highly random function, good values are as common and evenly dispersed as bad values, and random sampling of points is very efficient.","PeriodicalId":218136,"journal":{"name":"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512)","volume":"22 10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128818619","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evolutionary optimised ontogenetic neural networks with incremental problem complexity during development","authors":"B. Sendhoff","doi":"10.1109/CEC.2000.870823","DOIUrl":"https://doi.org/10.1109/CEC.2000.870823","url":null,"abstract":"In order to optimise unconstrained, large neural network structures with evolutionary algorithms, indirect encodings have been proposed. However, if the evolutionary process is combined with network learning, which is sensible both with respect to technical applications in dynamical environments and to the biological paragon, a way has to be found to combine learning with the evolutionary optimisation of such large structures. Utilising the development of neural systems during ontogeny seems a logical starting point for the realization of a step by step learning in networks. Furthermore, the combination of network growth during the developmental phase with an incremental problem complexity might allow the optimisation of large network structures together with learning. The author proposes a model to simulate such a combined approach and applies it to the problem of time series modelling. By introducing several measures for the transfer of information from one developmental step to the next, we will be able to quantitatively analyse the behaviour of the proposed model.","PeriodicalId":218136,"journal":{"name":"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115402665","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Why the schema theorem is correct also in the presence of stochastic effects","authors":"R. Poli","doi":"10.1109/CEC.2000.870336","DOIUrl":"https://doi.org/10.1109/CEC.2000.870336","url":null,"abstract":"J. Holland's (1975) schema theorem has been criticised by D.B. Fogel and A. Ghozeil (1997, 1998, 1999) for not being able to correctly estimate the expected proportion of a schema in the population when fitness-proportionate selection is used in the presence of noise or other stochastic effects. This is incorrect for two reasons. Firstly, the theorem in its original form is not applicable to this case. If the quantities involved in schema theorems are random variables, the theorems must be interpreted as conditional statements. Secondly, the conditional versions of Holland and other researchers' schema theorems are indeed very useful to model the sampling of schemata in the presence of stochasticity. In this paper, I show how one can calculate the correct expected proportion of a schema in the presence of stochastic effects when only selection is present, using a conditional interpretation of Holland's schema theorem. In addition, I generalise this result (again using schema theorems) to the case in which crossover, mutation and selection-with-replacement are used. This can be considered as an exact schema theorem that is applicable both in the presence and in the absence of stochastic effects.","PeriodicalId":218136,"journal":{"name":"Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512)","volume":"92 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2000-07-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115413135","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}