Constant optimization and feature standardization in multiobjective genetic programming

IF 1.7 · CAS Tier 3, Computer Science · Q3 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Rockett, Peter
{"title":"Constant optimization and feature standardization in multiobjective genetic programming","authors":"Rockett, Peter","doi":"10.1007/s10710-021-09410-y","DOIUrl":null,"url":null,"abstract":"<p>This paper extends the numerical tuning of tree constants in genetic programming (GP) to the multiobjective domain. Using ten real-world benchmark regression datasets and employing Bayesian comparison procedures, we first consider the effects of feature standardization (without constant tuning) and conclude that standardization generally produces lower test errors, but, contrary to other recently published work, we find much less clear trend for tree sizes. In addition, we consider the effects of constant tuning – with and without feature standardization – and observe that (1) constant tuning invariably improves test error, and (2) usually decreases tree size. Combined with standardization, constant tuning produces the best test error results; tree sizes, however, are increased. We also examine the effects of applying constant tuning only once at the end a conventional GP run which turns out to be surprisingly promising. Finally, we consider the merits of using numerical procedures to tune tree constants and observe that for around half the datasets evolutionary search alone is superior whereas for the remaining half, parameter tuning is superior. We identify a number of open research questions that arise from this work.</p>","PeriodicalId":50424,"journal":{"name":"Genetic Programming and Evolvable Machines","volume":"30 6","pages":""},"PeriodicalIF":1.7000,"publicationDate":"2021-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Genetic Programming and Evolvable Machines","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s10710-021-09410-y","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 1

Abstract

This paper extends the numerical tuning of tree constants in genetic programming (GP) to the multiobjective domain. Using ten real-world benchmark regression datasets and employing Bayesian comparison procedures, we first consider the effects of feature standardization (without constant tuning) and conclude that standardization generally produces lower test errors, but, contrary to other recently published work, we find a much less clear trend for tree sizes. In addition, we consider the effects of constant tuning, with and without feature standardization, and observe that (1) constant tuning invariably improves test error, and (2) it usually decreases tree size. Combined with standardization, constant tuning produces the best test error results; tree sizes, however, are increased. We also examine the effects of applying constant tuning only once at the end of a conventional GP run, which turns out to be surprisingly promising. Finally, we consider the merits of using numerical procedures to tune tree constants and observe that for around half the datasets evolutionary search alone is superior, whereas for the remaining half parameter tuning is superior. We identify a number of open research questions that arise from this work.
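
To make the two ingredients of the study concrete, the following sketch illustrates z-score feature standardization followed by numerical tuning of the constants embedded in a single GP tree. It is a minimal illustration only: the synthetic dataset, the fixed example expression c0*x0 + c1*sin(x1) + c2, and the choice of SciPy's Nelder-Mead optimizer are assumptions made for exposition and are not taken from the paper.

```python
# Minimal sketch (not the paper's implementation): standardize the
# features, then tune the constants of one fixed, hypothetical GP tree
# by minimizing training MSE with a local numerical optimizer.
import numpy as np
from scipy.optimize import minimize

# Toy regression data (assumed for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(loc=[5.0, -2.0], scale=[10.0, 0.5], size=(200, 2))
y = 3.0 * X[:, 0] + 2.0 * np.sin(X[:, 1]) + 1.0 + rng.normal(0.0, 0.1, 200)

# Feature standardization: zero mean, unit variance per feature.
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

def tree_output(consts, X):
    """Evaluate the fixed example tree c0*x0 + c1*sin(x1) + c2."""
    c0, c1, c2 = consts
    return c0 * X[:, 0] + c1 * np.sin(X[:, 1]) + c2

def mse(consts, X, y):
    """Mean squared training error: the objective being minimized."""
    return np.mean((tree_output(consts, X) - y) ** 2)

# Constant tuning: start from the constants the tree currently carries
# (arbitrary values here) and refine them numerically.
initial_consts = np.array([1.0, 1.0, 0.0])
result = minimize(mse, initial_consts, args=(X_std, y), method="Nelder-Mead")

print("tuned constants:", result.x)
print("training MSE:   ", result.fun)
```

In the multiobjective setting studied in the paper, the tuned error would feed the accuracy objective while tree size, which is unaffected by constant tuning, feeds the complexity objective; only the error objective is shown here.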

Source journal
Genetic Programming and Evolvable Machines (Engineering & Technology, Computer Science: Theory & Methods)
CiteScore: 5.90
Self-citation rate: 3.80%
Articles per year: 19
Review time: 6 months
Journal description: A unique source reporting on methods for artificial evolution of programs and machines... Reports innovative and significant progress in automatic evolution of software and hardware. Features both theoretical and application papers. Covers hardware implementations, artificial life, molecular computing and emergent computation techniques. Examines such related topics as evolutionary algorithms with variable-size genomes, alternate methods of program induction, approaches to engineering systems development based on embryology, morphogenesis or other techniques inspired by adaptive natural systems.