Landscape-Aware Automated Algorithm Configuration using Multi-output Mixed Regression and Classification
Fu Xing Long, Moritz Frenzel, Peter Krause, Markus Gitterle, Thomas Bäck, Niki van Stein
arXiv:2409.01446 [cs.NE], September 2, 2024
In the landscape-aware algorithm selection problem, the effectiveness of feature-based predictive models strongly depends on how representative the training data is of practical applications. In this work, we investigate the potential of randomly generated functions (RGF) for model training; RGF cover a much more diverse set of optimization problem classes than the widely used black-box optimization benchmarking (BBOB) suite. Correspondingly, we focus on automated algorithm configuration (AAC), that is, selecting the best-suited algorithm and fine-tuning its hyperparameters based on the landscape features of problem instances. Specifically, we analyze the performance of dense neural network (NN) models in handling the multi-output mixed regression and classification tasks using different training data sets, such as RGF and many-affine BBOB (MA-BBOB) functions. Based on our results on the BBOB functions in 5D and 20D, the proposed approach can identify near-optimal configurations that, most of the time, outperform the off-the-shelf default configuration used by practitioners with limited knowledge of AAC. Furthermore, the predicted configurations are competitive with the single best solver in many cases. Overall, the best-performing configurations are identified by NN models trained on a combination of RGF and MA-BBOB functions.
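To make the multi-output mixed setup concrete, such a model can be realized as a dense network with a shared trunk, a classification head that selects the algorithm, and a regression head that predicts its continuous hyperparameters, trained with a summed cross-entropy and mean-squared-error loss. The sketch below is a minimal PyTorch illustration, not the authors' implementation: the feature count, number of candidate algorithms, layer widths, and equal loss weighting are all assumptions for the example.

```python
# Minimal sketch of a dense NN for mixed regression + classification AAC.
# All dimensions and the 1:1 loss weighting are illustrative assumptions.
import torch
import torch.nn as nn

N_FEATURES = 64   # assumed number of landscape (ELA) features per instance
N_ALGOS = 5       # assumed number of candidate algorithms
N_HYPERS = 3      # assumed number of continuous hyperparameters to regress

class MixedAACNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared dense trunk over the landscape-feature vector.
        self.trunk = nn.Sequential(
            nn.Linear(N_FEATURES, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
        )
        self.algo_head = nn.Linear(128, N_ALGOS)    # classification logits
        self.hyper_head = nn.Linear(128, N_HYPERS)  # regression outputs

    def forward(self, x):
        h = self.trunk(x)
        return self.algo_head(h), self.hyper_head(h)

model = MixedAACNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
ce, mse = nn.CrossEntropyLoss(), nn.MSELoss()

# One training step on a dummy batch of feature vectors and targets.
x = torch.randn(32, N_FEATURES)
y_algo = torch.randint(0, N_ALGOS, (32,))   # best algorithm per instance
y_hyper = torch.randn(32, N_HYPERS)         # its tuned hyperparameters

logits, hypers = model(x)
loss = ce(logits, y_algo) + mse(hypers, y_hyper)  # mixed multi-output loss
opt.zero_grad()
loss.backward()
opt.step()
```

Training both heads on a shared trunk is the usual motivation for a multi-output formulation over separate per-task models: the classification and regression targets share a common feature representation and can regularize each other.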