Proceedings of the Companion Conference on Genetic and Evolutionary Computation: Latest Publications

Automated algorithm composition of unsupervised image clustering algorithms
Pub Date: 2023-07-15 | DOI: 10.1145/3583133.3590555
Mia Gerber, N. Pillay
Abstract: Unsupervised learning algorithms are popular as they do not require annotated data. However, as per the no-free-lunch theorem, the best algorithm to use is not the same for all datasets. This study is the first to automate the composition of an unsupervised image clustering algorithm. This work uses two different techniques to perform automated algorithm composition. The first technique is a genetic algorithm (GA) and the second is a genetic algorithm hyper-heuristic (GAHH). A comparison of the two techniques shows that the GA outperforms the GAHH. The GA designs unsupervised clustering algorithms that achieve state-of-the-art performance on the Oral lesion, Celebrity faces and COVID-19 datasets.
Citations: 0
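
The abstract does not spell out the search space, so the sketch below only illustrates the general idea of composing an unsupervised clustering pipeline with a GA: a chromosome picks one scaler, one dimensionality reducer, and one clusterer from small hypothetical component pools, and fitness is the silhouette score on a sample dataset. The component pools, dataset, and GA settings are assumptions for illustration, not the authors' configuration.

```python
import random
from sklearn.datasets import load_digits
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.metrics import silhouette_score

# Hypothetical component pools; the paper's actual search space is richer.
SCALERS = [StandardScaler, MinMaxScaler]
REDUCERS = [lambda: PCA(n_components=16), lambda: PCA(n_components=32)]
CLUSTERERS = [lambda: KMeans(n_clusters=10, n_init=10),
              lambda: AgglomerativeClustering(n_clusters=10, linkage="ward"),
              lambda: AgglomerativeClustering(n_clusters=10, linkage="average")]

X, _ = load_digits(return_X_y=True)

def random_individual():
    return [random.randrange(len(SCALERS)),
            random.randrange(len(REDUCERS)),
            random.randrange(len(CLUSTERERS))]

def fitness(ind):
    # Compose and run the clustering pipeline, then score it (higher is better).
    Xs = SCALERS[ind[0]]().fit_transform(X)
    Xr = REDUCERS[ind[1]]().fit_transform(Xs)
    labels = CLUSTERERS[ind[2]]().fit_predict(Xr)
    return silhouette_score(Xr, labels)

def evolve(pop_size=10, generations=5, p_mut=0.3):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[:pop_size // 2]           # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randint(1, 2)             # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:            # re-sample one gene
                g = random.randrange(3)
                child[g] = random_individual()[g]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("best composition:", best, "silhouette:", round(fitness(best), 3))
```
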
Hot off the Press: Runtime Analysis for the NSGA-II - Provable Speed-Ups From Crossover
Pub Date: 2023-07-15 | DOI: 10.1145/3583133.3595845
Benjamin Doerr, Zhongdi Qu
Abstract: Very recently, the first mathematical runtime analyses for the NSGA-II, the most common multi-objective evolutionary algorithm, have been conducted. Continuing this research direction, we prove that the NSGA-II optimizes the OneJumpZeroJump benchmark asymptotically faster when crossover is employed. Together with a parallel independent work by Dang, Opris, Salehi, and Sudholt (also at AAAI 2023), this is the first time such an advantage of crossover is proven for the NSGA-II. Our arguments can be transferred to single-objective optimization. They then prove that crossover can speed up the (μ + 1) genetic algorithm in a different and more pronounced way than known before. Our experiments confirm the added value of crossover and show that the observed advantages are even larger than what our proofs can guarantee. This paper for the Hot-off-the-Press track at GECCO 2023 summarizes the work: Benjamin Doerr, Zhongdi Qu. Runtime analysis for the NSGA-II: Provable speed-ups from crossover. Conference on Artificial Intelligence, AAAI 2023. AAAI Press, to appear [13].
Citations: 0
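
For readers unfamiliar with the benchmark, the sketch below gives the bi-objective OneJumpZeroJump function as we recall it from the runtime-analysis literature (bit-string length n, jump size k). The exact formulation here is our recollection, not taken from this paper, so the cited AAAI work should be consulted for the authoritative definition.

```python
def one_jump_zero_jump(x, k):
    """Bi-objective OneJumpZeroJump value of a bit string x (list of 0/1), jump size k.

    Both objectives are maximised: the first rewards many ones but has a fitness
    valley just before the all-ones string; the second is the symmetric
    counterpart for zeros.
    """
    n = len(x)
    ones = sum(x)
    zeros = n - ones
    f1 = k + ones if (ones <= n - k or ones == n) else n - ones
    f2 = k + zeros if (zeros <= n - k or zeros == n) else n - zeros
    return f1, f2

# Example: n = 10, k = 3. The all-ones string and strings with at most n-k ones lie
# on the "smooth" part of the first objective; the k-1 intermediate one-counts form
# the valley that crossover helps the NSGA-II to cross.
print(one_jump_zero_jump([1] * 10, 3))           # (13, 3)
print(one_jump_zero_jump([1] * 8 + [0] * 2, 3))  # inside the valley for f1: (2, 5)
```
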
Optimising Ferroelectric Thin Films with Evolutionary Computation
Pub Date: 2023-07-15 | DOI: 10.1145/3583133.3590750
E. Vissol-Gaudin, Y. Lim, K. Hippalgaonkar
Abstract: This paper presents the integration of machine learning and image analysis techniques into a material science experimental workflow. The aim is to optimise the properties of an Aluminium Scandium Nitride thin film through the manipulation of experimental input parameters. This is formulated as an optimisation problem, where the search space consists of the set of experimental input parameters used during the film's synthesis. The solution's fitness is obtained through the analysis of Scanning-Electron-Microscopy images and corresponds to the surface defect density over a film. An optimum solution to this problem is defined as the set of input parameters that consistently produces a film with no measurable surface defects. The search space is a black box with possibly more than one optimum, and the limited number of experiments that can be undertaken makes efficient exploration challenging. It is shown that classification can be used to reduce the problem's search space by identifying areas of infeasibility. Using nested cross-validation, tree-based classifiers emerge as the most accurate and, importantly, interpretable algorithms for this task. Subsequently, Particle Swarm Optimisation is used to find optimal solutions to the surface defect minimisation problem. Preliminary experimental results show a significant decrease in the average defect density achieved.
Citations: 0
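
The abstract combines a feasibility classifier with Particle Swarm Optimisation over experimental input parameters. The sketch below shows that combination in spirit only: `defect_density` stands in for the SEM image analysis, the parameter bounds and the `is_feasible` check are placeholders for the trained tree-based classifier, and the PSO coefficients are textbook defaults rather than the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 3                                    # e.g. temperature, pressure, power (hypothetical)
LOW, HIGH = np.zeros(DIM), np.ones(DIM)    # normalised parameter bounds

def defect_density(x):
    # Placeholder for the real objective: defect density measured from SEM images.
    return float(np.sum((x - 0.3) ** 2))

def is_feasible(x):
    # Placeholder for the trained classifier that rules out regions of the
    # input space known to produce unusable films.
    return bool(x[0] + x[1] < 1.5)

def pso(n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5, penalty=1e6):
    pos = rng.uniform(LOW, HIGH, (n_particles, DIM))
    vel = np.zeros((n_particles, DIM))

    def score(x):
        # Infeasible candidates (per the classifier) receive a large penalty,
        # which effectively removes that region from the search space.
        return defect_density(x) if is_feasible(x) else penalty

    pbest = pos.copy()
    pbest_val = np.array([score(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for _ in range(iters):
        r1, r2 = rng.random((n_particles, DIM)), rng.random((n_particles, DIM))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, LOW, HIGH)
        vals = np.array([score(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())

best_x, best_f = pso()
print("best parameters:", np.round(best_x, 3), "defect score:", round(best_f, 4))
```
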
Diversity Search for the Generation of Diverse Grasping Trajectories
Pub Date: 2023-07-15 | DOI: 10.1145/3583133.3590718
J. Huber, Oumar Sane, Miranda Coninx, F. Ben Amar, S. Doncieux
Abstract: Robotic grasping refers to making a robotic system pick an object by applying forces and torques on its surface. Despite the recent advances in data-driven approaches, grasping remains an unsolved problem. In this work, we consider grasping as a Diversity Search problem, where we attempt to find as many solutions as possible that satisfy a sparse binary criterion. We propose a variant of a state-of-the-art QD (Quality-Diversity) method for grasping based on a divide-and-conquer paradigm to handle grasping discontinuities. Experiments conducted on 3 different robot-gripper setups and several standard objects show that this variant outperforms the state of the art for generating diverse repertoires of grasping trajectories.
Citations: 1
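
The paper's specific QD variant is not detailed in the abstract, so the sketch below only illustrates the underlying diversity-search idea it builds on: randomly varied grasp candidates are kept in a behaviour-indexed archive whenever they satisfy a sparse binary success criterion, so the archive grows into a diverse repertoire. The simulator, encoding, behaviour descriptor, and all parameters here are placeholders.

```python
import random

GENOME_LEN = 6        # placeholder encoding of a grasping trajectory
ARCHIVE_BINS = 10     # resolution of the behaviour-descriptor grid

def simulate_grasp(genome):
    """Placeholder for the grasp simulation.

    Returns (success, descriptor): a sparse binary criterion (did the object stay
    in the gripper?) and a 2-D behaviour descriptor (e.g. approach angle and
    contact position, both normalised to [0, 1]).
    """
    success = sum(genome) > 0.7 * GENOME_LEN      # rarely true: sparse reward
    descriptor = (genome[0], genome[1])
    return success, descriptor

def bin_of(descriptor):
    return tuple(min(int(d * ARCHIVE_BINS), ARCHIVE_BINS - 1) for d in descriptor)

def diversity_search(evaluations=5000, sigma=0.1):
    archive = {}                                   # descriptor bin -> successful genome
    for _ in range(evaluations):
        if archive and random.random() < 0.5:
            # Mutate a previously successful grasp to explore nearby behaviours.
            parent = random.choice(list(archive.values()))
            genome = [min(1.0, max(0.0, g + random.gauss(0, sigma))) for g in parent]
        else:
            genome = [random.random() for _ in range(GENOME_LEN)]
        success, descriptor = simulate_grasp(genome)
        if success:
            archive.setdefault(bin_of(descriptor), genome)
    return archive

repertoire = diversity_search()
print("distinct grasping behaviours found:", len(repertoire))
```
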
A Dynamic Partial Update for Covariance Matrix Adaptation
Pub Date: 2023-07-15 | DOI: 10.1145/3583133.3596385
Hiroki Shimizu, Masashi Toyoda
Abstract: Tackling large-scale and ill-conditioned problems is demanding even for the covariance matrix adaptation evolution strategy (CMA-ES), a state-of-the-art algorithm for black-box optimization. Coordinate selection is a technique that mitigates the ill-conditioning of large-scale problems by updating parameters in partially selected coordinate spaces. This technique can be applied to various CMA-ES variants and improves their performance, especially on ill-conditioned problems. However, it often fails to improve performance on well-conditioned problems, because it is difficult to choose appropriate coordinate spaces according to the ill-conditioning of a problem. We introduce a dynamic partial update method for coordinate selection to solve this problem. We use the second-order partial derivatives of the objective function to estimate the condition number and select coordinates so that the condition number of each pair does not exceed a given allowable value. With this method, the number of clusters becomes small for well-conditioned problems and large for ill-conditioned ones. In particular, the selection is not executed if the condition number of the full space is less than the allowable value. We observe significant improvements on well-conditioned problems and comparable performance in ill-conditioned cases in numerical experiments.
Citations: 0
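
To make the coordinate-grouping idea concrete, the sketch below groups coordinates from estimated second-order partial derivatives so that the curvature ratio (an estimate of the pairwise condition number) inside each group stays below an allowable value, and performs no selection when the full space already satisfies the bound. This is only our reading of the mechanism in the abstract; the simple greedy grouping is an assumption, not the authors' exact procedure.

```python
def group_coordinates(second_derivs, allowed_kappa):
    """Group coordinate indices so that, within each group, the ratio of the
    largest to the smallest estimated curvature does not exceed allowed_kappa.

    second_derivs: estimated d^2 f / dx_i^2 for each coordinate (all positive).
    Returns a list of index groups; a single group means no partial update is needed.
    """
    order = sorted(range(len(second_derivs)), key=lambda i: second_derivs[i])
    full_kappa = second_derivs[order[-1]] / second_derivs[order[0]]
    if full_kappa <= allowed_kappa:
        return [order]                      # well-conditioned: keep the full space

    groups, current = [], [order[0]]
    for i in order[1:]:
        # Greedily extend the current group while the in-group ratio stays allowed.
        if second_derivs[i] / second_derivs[current[0]] <= allowed_kappa:
            current.append(i)
        else:
            groups.append(current)
            current = [i]
    groups.append(current)
    return groups

# Example: a mildly ill-conditioned 5-D problem with an allowable condition number of 10.
curvatures = [1.0, 2.0, 30.0, 50.0, 400.0]
print(group_coordinates(curvatures, allowed_kappa=10.0))
# -> [[0, 1], [2, 3], [4]]: the CMA-ES would then update each cluster separately.
```
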
Extending MOEA/D to Constrained Multi-objective Optimization via Making Constraints an Objective Function
Pub Date: 2023-07-15 | DOI: 10.1145/3583133.3590583
Y. Yasuda, K. Tamura, K. Yasuda
Abstract: The Multiobjective Evolutionary Algorithm based on Decomposition (MOEA/D) is effective for solving multi-objective optimization problems. However, in real-world applications, problems with imposed constraints are common. Therefore, research on Constraint Handling Techniques (CHTs) has been conducted. CHTs focus on improving search performance by utilizing infeasible solutions. Multi-objective-based CHTs are effective in promoting convergence and diversity in solution sets, but existing CHTs for MOEA/D have limitations in terms of flexibility and extensibility (e.g., the scalarization function to be used). To overcome this, this paper proposes a CHT that uses two sets of weight vectors to make the constraints an objective function. The proposed method is flexible and can be used in any MOEA/D variant. It is incorporated into a basic MOEA/D, and its effectiveness is demonstrated by comparing it with existing constrained MOEA/D variants on 2- and 3-objective benchmark problems.
Citations: 0
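
The core device the abstract describes, treating the constraints as an additional objective, can be sketched as below: the total constraint violation is appended to the objective vector, and a Tchebycheff scalarization (one of the scalarizations commonly used by MOEA/D) then compares solutions under a given weight vector. How the paper's two sets of weight vectors interact is not spelled out in the abstract, so that part is omitted; the test problem and weights here are assumptions.

```python
def constrained_objective_vector(f, g):
    """Append the total constraint violation as an extra objective to minimise.

    f: list of objective values (minimisation),
    g: list of constraint values with the convention g_j(x) <= 0 when feasible.
    """
    violation = sum(max(0.0, gj) for gj in g)
    return f + [violation]

def tchebycheff(fv, weights, ideal):
    """Tchebycheff scalarization used by MOEA/D subproblems (smaller is better)."""
    return max(w * abs(v - z) for v, w, z in zip(fv, weights, ideal))

# Toy example: two objectives plus one constraint x0 + x1 >= 1 (i.e. 1 - x0 - x1 <= 0).
def evaluate(x):
    f = [x[0] ** 2, (x[0] - 1) ** 2 + x[1] ** 2]
    g = [1.0 - x[0] - x[1]]
    return constrained_objective_vector(f, g)

ideal = [0.0, 0.0, 0.0]
weights = [0.4, 0.4, 0.2]          # third weight handles the violation "objective"
feasible, infeasible = evaluate([0.6, 0.6]), evaluate([0.1, 0.1])
print(tchebycheff(feasible, weights, ideal))     # violation component is zero
print(tchebycheff(infeasible, weights, ideal))   # penalised via the extra objective
```
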
An Investigation of Terminal Settings on Multitask Multi-objective Dynamic Flexible Job Shop Scheduling with Genetic Programming
Pub Date: 2023-07-15 | DOI: 10.1145/3583133.3590546
Fangfang Zhang, Yi Mei, Mengjie Zhang
Abstract: Multitask learning has attracted widespread attention for handling multiple tasks simultaneously. Multitask genetic programming has been successfully used to learn scheduling heuristics for multiple multi-objective dynamic flexible job shop scheduling tasks simultaneously. With genetic programming, the learned scheduling heuristics consist of terminals that are extracted from the features of specific tasks. However, how to set proper terminals with multiple tasks still needs to be investigated. This paper investigates the effectiveness of three strategies for this purpose, i.e., an intersection strategy that uses the terminals common to the tasks, a separation strategy that applies different terminals to different tasks, and a union strategy that utilises all the terminals needed by all tasks. The results show that the union strategy, which gives each task the terminals needed by all tasks, performs best. In addition, we find that the routing/sequencing rules learned by the developed algorithm with the union strategy in one multitask scenario can share knowledge with each other. More importantly, the learned routing/sequencing rules can also be specific to their tasks, with distinct knowledge represented by genetic material.
Citations: 0
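
The three terminal-setting strategies reduce to simple set operations over the per-task terminal lists; the sketch below shows them on two hypothetical task terminal sets (the terminal abbreviations are illustrative, not necessarily the ones used in the paper).

```python
# Hypothetical terminal sets for two dynamic flexible job shop scheduling tasks.
task_terminals = {
    "task1": {"PT", "NPT", "WIQ", "MWT", "OWT"},   # e.g. processing time, work in queue, ...
    "task2": {"PT", "TIS", "WKR", "OWT", "NIQ"},
}

def intersection_strategy(terminals_by_task):
    """Every task uses only the terminals common to all tasks."""
    common = set.intersection(*terminals_by_task.values())
    return {task: common for task in terminals_by_task}

def separation_strategy(terminals_by_task):
    """Each task keeps exactly its own terminals."""
    return {task: set(terms) for task, terms in terminals_by_task.items()}

def union_strategy(terminals_by_task):
    """Every task is given all terminals needed by any task (the best-performing
    strategy according to the paper's results)."""
    union = set.union(*terminals_by_task.values())
    return {task: union for task in terminals_by_task}

for strategy in (intersection_strategy, separation_strategy, union_strategy):
    print(strategy.__name__, strategy(task_terminals))
```
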
EvoPrunerPool: An Evolutionary Pruner using Pruner Pool for Compressing Convolutional Neural Networks
Pub Date: 2023-07-15 | DOI: 10.1145/3583133.3596333
Shunmuga Velayutham C., Sujit Subramanian S, A. K, M. Sathya, Nathiyaa Sengodan, Divesh Kosuri, Sai Satvik Arvapalli, Thangavelu S, J. G
Abstract: This paper proposes EvoPrunerPool, an Evolutionary Pruner using a Pruner Pool for compressing Convolutional Neural Networks. EvoPrunerPool formulates filter pruning as a search problem: identifying the right set of pruners from a pool of off-the-shelf filter pruners and applying them in an appropriate sequence to incrementally sparsify a given Convolutional Neural Network. The efficacy of EvoPrunerPool has been demonstrated on the LeNet model using MNIST data as well as on the VGG-19 deep model using CIFAR-10 data, and its performance has been benchmarked against state-of-the-art model compression approaches. Experiments demonstrate a very competitive and effective performance of the proposed Evolutionary Pruner. Since EvoPrunerPool employs the native representation of a popular machine learning framework and filter pruners from a well-known AutoML toolkit, the proposed approach is both extensible and generic. Consequently, a typical practitioner can use EvoPrunerPool without any in-depth understanding of filter pruning in particular or model compression in general.
Citations: 0
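
The abstract does not name the framework or toolkit, so the sketch below keeps the idea abstract: a chromosome is a short sequence of (pruner, sparsity) choices drawn from a pool of stand-in pruner names, and a GA searches for the sequence with the best fitness. The pruner pool, the fitness weighting, and the evaluation stub are all assumptions for illustration; in the real system each step would apply an off-the-shelf filter pruner to the network and measure accuracy and sparsity.

```python
import random

# Stand-in pruner pool: names only; the real system would invoke off-the-shelf
# filter pruners on the CNN with the chosen sparsity at each step.
PRUNER_POOL = ["l1_norm", "l2_norm", "fpgm", "activation_apoz"]
SPARSITY_LEVELS = [0.1, 0.2, 0.3]
SEQUENCE_LEN = 3

def evaluate(sequence):
    """Stub for: apply the pruner sequence, fine-tune, then measure accuracy and
    achieved sparsity. A synthetic score is used here purely so the sketch runs."""
    total_sparsity = sum(s for _, s in sequence)
    diversity = len({name for name, _ in sequence})
    accuracy_proxy = max(0.0, 1.0 - 0.8 * total_sparsity)   # pretend accuracy drop
    return 0.7 * accuracy_proxy + 0.2 * min(total_sparsity, 0.6) + 0.1 * diversity / SEQUENCE_LEN

def random_sequence():
    return [(random.choice(PRUNER_POOL), random.choice(SPARSITY_LEVELS))
            for _ in range(SEQUENCE_LEN)]

def evolve(pop_size=12, generations=8, p_mut=0.3):
    pop = [random_sequence() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=evaluate, reverse=True)
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randint(1, SEQUENCE_LEN - 1)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:                      # re-sample one step
                child[random.randrange(SEQUENCE_LEN)] = (random.choice(PRUNER_POOL),
                                                         random.choice(SPARSITY_LEVELS))
            children.append(child)
        pop = survivors + children
    return max(pop, key=evaluate)

print("best pruner sequence:", evolve())
```
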
Does size matter? On the influence of ensemble size on constructing ensembles of dispatching rules
Pub Date: 2023-07-15 | DOI: 10.1145/3583133.3590562
Marko Durasevic, F. Gil-Gala, D. Jakobović
Abstract: Recent years have seen an increase in the application of genetic programming (GP) as a hyper-heuristic, i.e., a method used to generate heuristics for solving various combinatorial optimisation problems. One of its widest applications is in scheduling, where it is used to automatically design constructive heuristics called dispatching rules (DRs). DRs are crucial for solving dynamic scheduling environments, in which the conditions change over time. Although automatically designed DRs achieve good results, their performance is limited, as a single DR cannot always perform well. Therefore, various methods have been used to improve their performance, among which ensemble learning represents one of the most promising directions. Using ensembles introduces several new parameters, such as the ensemble construction method, the ensemble collaboration method, and the ensemble size. This study investigates the possibility of removing the ensemble size parameter when constructing ensembles. To this end, the simple ensemble combination method is adapted to randomly select the size of the ensemble it generates, rather than using a fixed ensemble size. Experimental results demonstrate that not using a fixed ensemble size does not result in worse performance, and that the best ensembles are of smaller sizes. This shows that the ensemble size parameter can be eliminated without a significant influence on performance.
Citations: 0
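
The adaptation described in the abstract, letting the ensemble combination method draw the ensemble size at random instead of fixing it, can be illustrated as below. The dispatching rules are placeholder priority functions (shortest processing time, earliest due date, and so on), and the sum-based collaboration scheme is just one common choice, not necessarily the one used in the paper.

```python
import random

# Placeholder dispatching rules: each maps a job's attributes to a priority
# (lower value = dispatched first).
RULE_POOL = {
    "SPT": lambda job: job["proc_time"],
    "EDD": lambda job: job["due_date"],
    "LWR": lambda job: job["work_remaining"],
    "FIFO": lambda job: job["arrival_time"],
}

def random_size_ensemble(max_size=None):
    """Simple ensemble construction with a randomly drawn size instead of a fixed one."""
    names = list(RULE_POOL)
    max_size = max_size or len(names)
    size = random.randint(1, max_size)
    return random.sample(names, size)

def ensemble_priority(ensemble, job):
    # Collaboration by summing the priorities assigned by the ensemble members.
    return sum(RULE_POOL[name](job) for name in ensemble)

jobs = [
    {"id": 1, "proc_time": 4, "due_date": 20, "work_remaining": 9, "arrival_time": 0},
    {"id": 2, "proc_time": 2, "due_date": 15, "work_remaining": 12, "arrival_time": 1},
    {"id": 3, "proc_time": 6, "due_date": 10, "work_remaining": 6, "arrival_time": 2},
]

ensemble = random_size_ensemble()
dispatch_order = sorted(jobs, key=lambda job: ensemble_priority(ensemble, job))
print("ensemble:", ensemble)
print("dispatch order:", [job["id"] for job in dispatch_order])
```
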
Evolutionary Pruning of Deep Convolutional Networks by a Memetic GA with Sped-Up Local Optimization and GLCM Energy Z-Score
Pub Date: 2023-07-15 | DOI: 10.1145/3583133.3590604
Hana Cho, Han Joon Byun, Min Kee Kim, Joon Huh, Byung-Ro Moon
Abstract: This paper introduces a novel method for selecting the most significant filters in deep neural networks. We performed model simplification via pruning with a Genetic Algorithm (GA) on trained deep networks. A pure GA is weak at local tuning and converges slowly, so it struggles to produce good results for problems with a large search space such as ours. We present new ideas that overcome some of the GA's weaknesses. These include efficient local optimization as well as reducing the evaluation time, which occupies most of the running time. Additional time was saved by restricting the set of filters to preserve, using the GLCM (Gray-Level Co-occurrence Matrix) to determine the usefulness of the filters. Ultimately, the saved time was used to perform more iterations, providing the opportunity to further optimize the network. The experimental results showed more than a 95% reduction in forward convolution computation with negligible performance degradation.
Citations: 0
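
To show what a GLCM energy z-score over convolutional filters could look like, the sketch below quantises each filter's 2-D weight map, computes its GLCM energy with scikit-image (assuming version 0.19+ naming, `graycomatrix`/`graycoprops`), and standardises the energies across the layer. The quantisation level, the threshold, and the direction of the selection rule (which filters to protect from pruning) are our assumptions; the paper's exact criterion may differ.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19 naming

def glcm_energy(filter_weights, levels=8):
    """GLCM energy of a single convolutional filter's 2-D weight map."""
    w = filter_weights.astype(np.float64)
    # Quantise the weights to a small number of grey levels for the co-occurrence matrix.
    w = (w - w.min()) / (w.max() - w.min() + 1e-12)
    q = np.clip((w * levels).astype(np.uint8), 0, levels - 1)
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    return float(graycoprops(glcm, "energy").mean())

def energy_z_scores(filters):
    """Z-score of GLCM energy for each filter in a layer (filters: [n, kh, kw])."""
    energies = np.array([glcm_energy(f) for f in filters])
    return (energies - energies.mean()) / (energies.std() + 1e-12)

# Example with random 5x5 filters standing in for a trained layer's weights.
rng = np.random.default_rng(0)
filters = rng.normal(size=(16, 5, 5))
z = energy_z_scores(filters)
protected = np.where(z > 1.0)[0]     # illustrative threshold/direction, not the paper's rule
print("energy z-scores:", np.round(z, 2))
print("filters protected from pruning (assumed rule):", protected.tolist())
```
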