Metaheuristics in automated machine learning: Strategies for optimization

Francesco Zito, El-Ghazali Talbi, Claudia Cavallaro, Vincenzo Cutello, Mario Pavone

Intelligent Systems with Applications, Volume 26, Article 200532, June 2025.
DOI: 10.1016/j.iswa.2025.200532
https://www.sciencedirect.com/science/article/pii/S2667305325000584
Citations: 0
Abstract
The present work explores the application of Automated Machine Learning techniques, particularly the optimization of Artificial Neural Networks through hyperparameter tuning. Artificial Neural Networks are widely used across many fields; however, building and optimizing them presents significant challenges. With effective hyperparameter tuning, shallow neural networks might become competitive with their deeper counterparts, which in turn makes them better suited to low-power applications. In this work, we highlight the importance of Hyperparameter Optimization in enhancing neural network performance. We examine the metaheuristic algorithms employed for this task and, in particular, their effectiveness in improving model performance across diverse applications. Despite significant advances in this area, a comprehensive comparison of these algorithms across different deep learning architectures is still lacking. This work aims to fill that gap by systematically evaluating the performance of metaheuristic algorithms in optimizing hyperparameters, and by discussing advanced techniques such as parallel computing that adapt metaheuristic algorithms to hyperparameter optimization in high-dimensional search spaces.
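To make the idea of metaheuristic hyperparameter optimization concrete, here is a minimal sketch of a (1+1) evolution strategy tuning two hyperparameters. The objective `validation_loss` is a synthetic stand-in (not the paper's benchmark): it mimics a validation loss that is minimized near a learning rate of 0.01 and 32 hidden units. All function names, ranges, and constants are illustrative assumptions, not the authors' method.

```python
import random

def validation_loss(lr, hidden):
    # Hypothetical surrogate for a shallow network's validation loss:
    # smallest near lr = 0.01 and hidden = 32 (illustrative only).
    return (lr - 0.01) ** 2 * 1e4 + ((hidden - 32) / 32) ** 2

def one_plus_one_es(iterations=200, seed=0):
    """(1+1) evolution strategy: mutate the incumbent, keep it if better."""
    rng = random.Random(seed)
    lr, hidden = 0.1, 8.0          # initial hyperparameter guess
    best = validation_loss(lr, hidden)
    for _ in range(iterations):
        # Gaussian mutation of both hyperparameters, clipped to valid ranges
        cand_lr = min(max(lr + rng.gauss(0, 0.01), 1e-5), 1.0)
        cand_hidden = min(max(hidden + rng.gauss(0, 4), 1.0), 256.0)
        loss = validation_loss(cand_lr, cand_hidden)
        if loss < best:            # greedy acceptance of improvements
            lr, hidden, best = cand_lr, cand_hidden, loss
    return lr, hidden, best
```

In practice each `validation_loss` call would train and validate a network, which is why the paper's discussion of parallel evaluation matters: candidate configurations are independent and can be scored concurrently.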