Lexicase-based Selection Methods with Down-sampling for Symbolic Regression Problems: Overview and Benchmark

Alina Geiger, Dominik Sobania, Franz Rothlauf
{"title":"Lexicase-based Selection Methods with Down-sampling for Symbolic Regression Problems: Overview and Benchmark","authors":"Alina Geiger, Dominik Sobania, Franz Rothlauf","doi":"arxiv-2407.21632","DOIUrl":null,"url":null,"abstract":"In recent years, several new lexicase-based selection variants have emerged\ndue to the success of standard lexicase selection in various application\ndomains. For symbolic regression problems, variants that use an\nepsilon-threshold or batches of training cases, among others, have led to\nperformance improvements. Lately, especially variants that combine lexicase\nselection and down-sampling strategies have received a lot of attention. This\npaper evaluates random as well as informed down-sampling in combination with\nthe relevant lexicase-based selection methods on a wide range of symbolic\nregression problems. In contrast to most work, we not only compare the methods\nover a given evaluation budget, but also over a given time as time is usually\nlimited in practice. We find that for a given evaluation budget,\nepsilon-lexicase selection in combination with random or informed down-sampling\noutperforms all other methods. Only for a rather long running time of 24h, the\nbest performing method is tournament selection in combination with informed\ndown-sampling. If the given running time is very short, lexicase variants using\nbatches of training cases perform best.","PeriodicalId":501347,"journal":{"name":"arXiv - CS - Neural and Evolutionary Computing","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Neural and Evolutionary Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2407.21632","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

In recent years, several new lexicase-based selection variants have emerged, motivated by the success of standard lexicase selection in various application domains. For symbolic regression problems, variants that use an epsilon threshold or batches of training cases, among others, have led to performance improvements. Lately, variants that combine lexicase selection with down-sampling strategies have received particular attention. This paper evaluates random as well as informed down-sampling in combination with the relevant lexicase-based selection methods on a wide range of symbolic regression problems. In contrast to most prior work, we compare the methods not only over a given evaluation budget but also over a given running time, as time is usually limited in practice. We find that for a given evaluation budget, epsilon-lexicase selection in combination with random or informed down-sampling outperforms all other methods. Only for a rather long running time of 24 h is the best-performing method tournament selection in combination with informed down-sampling. If the given running time is very short, lexicase variants using batches of training cases perform best.
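For readers unfamiliar with the two key components, below is a minimal Python sketch of epsilon-lexicase selection combined with random down-sampling. It is illustrative only, not the paper's implementation: the function names are hypothetical, `errors[i][c]` is assumed to hold individual i's error on training case c, and epsilon is computed per case as the median absolute deviation of the surviving candidates' errors (one common variant, following La Cava et al., 2016).

```python
import random
import statistics

def random_down_sample(case_ids, rate):
    """Randomly pick a fraction of the training cases for this generation."""
    k = max(1, int(len(case_ids) * rate))
    return random.sample(list(case_ids), k)

def epsilon_lexicase_select(population, errors, case_ids):
    """Select one parent with epsilon-lexicase selection.

    errors[i][c] is the (absolute) error of individual i on training
    case c. Per case, epsilon is the median absolute deviation (MAD)
    of the surviving candidates' errors (a dynamic variant).
    """
    candidates = list(range(len(population)))
    cases = list(case_ids)
    random.shuffle(cases)  # cases are considered in random order
    for c in cases:
        if len(candidates) == 1:
            break
        col = [errors[i][c] for i in candidates]
        med = statistics.median(col)
        eps = statistics.median(abs(e - med) for e in col)
        best = min(col)
        # keep only candidates within epsilon of the best error on case c
        candidates = [i for i in candidates if errors[i][c] <= best + eps]
    return population[random.choice(candidates)]

# Hypothetical per-generation driver: evaluate the population only on
# the down-sampled cases, then select parents using that subsample.
# subsample = random_down_sample(range(n_cases), rate=0.1)
# parent = epsilon_lexicase_select(population, errors, subsample)
```

Down-sampling reduces the per-generation evaluation cost, since each individual is evaluated only on the sampled cases. Informed down-sampling differs from the random variant sketched here in that it chooses cases based on population performance, so that the subsample contains distinct, informative cases rather than a uniform random subset.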