Metaheuristics in automated machine learning: Strategies for optimization

Francesco Zito, El-Ghazali Talbi, Claudia Cavallaro, Vincenzo Cutello, Mario Pavone
{"title":"Metaheuristics in automated machine learning: Strategies for optimization","authors":"Francesco Zito ,&nbsp;El-Ghazali Talbi ,&nbsp;Claudia Cavallaro ,&nbsp;Vincenzo Cutello ,&nbsp;Mario Pavone","doi":"10.1016/j.iswa.2025.200532","DOIUrl":null,"url":null,"abstract":"<div><div>The present work explores the application of Automated Machine Learning techniques, particularly on the optimization of Artificial Neural Networks through hyperparameter tuning. Artificial Neural Networks are widely used across various fields, however building and optimizing them presents significant challenges. By employing an effective hyperparameter tuning, shallow neural networks might become competitive with their deeper counterparts, which in turn makes them more suitable for low-power consumption applications. In our work, we highlight the importance of Hyperparameter Optimization in enhancing neural network performance. We examine various metaheuristic algorithms employed and, in particular, their effectiveness in improving model performance across diverse applications. Despite significant advancements in this area, a comprehensive comparison of these algorithms across different deep learning architectures remains lacking. This work aims to fill this gap by systematically evaluating the performance of metaheuristic algorithms in optimizing hyperparameters and discussing advanced techniques such as parallel computing to adapt metaheuristic algorithms for use in hyperparameter optimization with high-dimensional hyperparameter search space.</div></div>","PeriodicalId":100684,"journal":{"name":"Intelligent Systems with Applications","volume":"26 ","pages":"Article 200532"},"PeriodicalIF":0.0000,"publicationDate":"2025-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Intelligent Systems with Applications","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2667305325000584","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

The present work explores the application of Automated Machine Learning techniques, with a particular focus on optimizing Artificial Neural Networks through hyperparameter tuning. Artificial Neural Networks are widely used across many fields; however, building and optimizing them presents significant challenges. With effective hyperparameter tuning, shallow neural networks may become competitive with their deeper counterparts, which in turn makes them better suited to low-power applications. In our work, we highlight the importance of Hyperparameter Optimization in enhancing neural network performance. We examine the metaheuristic algorithms employed for this task, focusing on their effectiveness in improving model performance across diverse applications. Despite significant advances in this area, a comprehensive comparison of these algorithms across different deep learning architectures is still lacking. This work aims to fill that gap by systematically evaluating the performance of metaheuristic algorithms in optimizing hyperparameters, and by discussing advanced techniques, such as parallel computing, that adapt metaheuristics to hyperparameter optimization over high-dimensional search spaces.
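To make the idea concrete, here is a minimal sketch of metaheuristic hyperparameter optimization: a (1+λ) evolutionary strategy tuning a shallow scikit-learn MLP on a toy dataset. The search space, mutation scheme, and evaluation budget below are illustrative assumptions for this sketch, not the paper's experimental setup or the specific algorithms it evaluates.

```python
# Sketch: (1+lambda) evolutionary strategy for hyperparameter tuning
# of a shallow MLP. Search space and budget are illustrative assumptions.
import random
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

X, y = load_digits(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

def sample():
    """Draw a random configuration from a hypothetical search space."""
    return {
        "hidden_units": random.randint(8, 256),      # width of the single hidden layer
        "learning_rate_init": 10 ** random.uniform(-4, -1),
        "alpha": 10 ** random.uniform(-6, -2),       # L2 regularization strength
    }

def mutate(cfg):
    """Local move: perturb one hyperparameter of the parent."""
    child = dict(cfg)
    key = random.choice(list(child))
    if key == "hidden_units":
        child[key] = min(256, max(8, child[key] + random.randint(-32, 32)))
    else:
        child[key] *= 10 ** random.uniform(-0.5, 0.5)
    return child

def fitness(cfg):
    """Objective: validation accuracy of a shallow MLP with this configuration."""
    model = MLPClassifier(hidden_layer_sizes=(cfg["hidden_units"],),
                          learning_rate_init=cfg["learning_rate_init"],
                          alpha=cfg["alpha"], max_iter=200, random_state=0)
    model.fit(X_tr, y_tr)
    return accuracy_score(y_val, model.predict(X_val))

# (1+lambda) loop: keep the best of the parent and its offspring.
best = sample()
best_fit = fitness(best)
for generation in range(10):
    for cfg in (mutate(best) for _ in range(4)):
        f = fitness(cfg)
        if f > best_fit:
            best, best_fit = cfg, f
    print(f"gen {generation}: best accuracy = {best_fit:.4f}")

print("best configuration:", best)
```

Because each fitness evaluation is an independent model training, the offspring loop is a natural target for the parallel-evaluation techniques the abstract alludes to, e.g. dispatching candidate configurations to a process pool when the search space grows high-dimensional.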