A Novel Black Widow Optimization Algorithm Based on Lagrange Interpolation Operator for ResNet18.

IF 3.9 · CAS Tier 3 (Medicine) · JCR Q1 (Engineering, Multidisciplinary)
Peiyang Wei, Can Hu, Jingyi Hu, Zhibin Li, Wen Qin, Jianhong Gan, Tinghui Chen, Hongping Shu, Mingsheng Shang
{"title":"一种新的基于Lagrange插值算子的黑寡妇优化算法。","authors":"Peiyang Wei, Can Hu, Jingyi Hu, Zhibin Li, Wen Qin, Jianhong Gan, Tinghui Chen, Hongping Shu, Mingsheng Shang","doi":"10.3390/biomimetics10060361","DOIUrl":null,"url":null,"abstract":"<p><p>Hyper-parameters play a critical role in neural networks; they significantly impact both training effectiveness and overall model performance. Proper hyper-parameter settings can accelerate model convergence and improve generalization. Among various hyper-parameters, the learning rate is particularly important. However, optimizing the learning rate typically requires extensive experimentation and tuning, as its setting is often dependent on specific tasks and datasets and therefore lacks universal rules or standards. Consequently, adjustments are generally made through trial and error, thereby making the selection of the learning rate complex and time-consuming. In an attempt to surmount this challenge, evolutionary computation algorithms can automatically adjust the hyper-parameter learning rate to improve training efficiency and model performance. In response to this, we propose a black widow optimization algorithm based on Lagrange interpolation (LIBWONN) to optimize the learning rate of ResNet18. Moreover, we evaluate LIBWONN's effectiveness using 24 benchmark functions from CEC2017 and CEC2022 and compare it with nine advanced metaheuristic algorithms. The experimental results indicate that LIBWONN outperforms the other algorithms in convergence and stability. Additionally, experiments on publicly available datasets from six different fields demonstrate that LIBWONN improves the accuracy on both training and testing sets compared to the standard BWO, with gains of 6.99% and 4.48%, respectively.</p>","PeriodicalId":8907,"journal":{"name":"Biomimetics","volume":"10 6","pages":""},"PeriodicalIF":3.9000,"publicationDate":"2025-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12190972/pdf/","citationCount":"0","resultStr":"{\"title\":\"A Novel Black Widow Optimization Algorithm Based on Lagrange Interpolation Operator for ResNet18.\",\"authors\":\"Peiyang Wei, Can Hu, Jingyi Hu, Zhibin Li, Wen Qin, Jianhong Gan, Tinghui Chen, Hongping Shu, Mingsheng Shang\",\"doi\":\"10.3390/biomimetics10060361\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Hyper-parameters play a critical role in neural networks; they significantly impact both training effectiveness and overall model performance. Proper hyper-parameter settings can accelerate model convergence and improve generalization. Among various hyper-parameters, the learning rate is particularly important. However, optimizing the learning rate typically requires extensive experimentation and tuning, as its setting is often dependent on specific tasks and datasets and therefore lacks universal rules or standards. Consequently, adjustments are generally made through trial and error, thereby making the selection of the learning rate complex and time-consuming. In an attempt to surmount this challenge, evolutionary computation algorithms can automatically adjust the hyper-parameter learning rate to improve training efficiency and model performance. In response to this, we propose a black widow optimization algorithm based on Lagrange interpolation (LIBWONN) to optimize the learning rate of ResNet18. 
Moreover, we evaluate LIBWONN's effectiveness using 24 benchmark functions from CEC2017 and CEC2022 and compare it with nine advanced metaheuristic algorithms. The experimental results indicate that LIBWONN outperforms the other algorithms in convergence and stability. Additionally, experiments on publicly available datasets from six different fields demonstrate that LIBWONN improves the accuracy on both training and testing sets compared to the standard BWO, with gains of 6.99% and 4.48%, respectively.</p>\",\"PeriodicalId\":8907,\"journal\":{\"name\":\"Biomimetics\",\"volume\":\"10 6\",\"pages\":\"\"},\"PeriodicalIF\":3.9000,\"publicationDate\":\"2025-06-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12190972/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Biomimetics\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.3390/biomimetics10060361\",\"RegionNum\":3,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Biomimetics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.3390/biomimetics10060361","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0

Abstract


Hyper-parameters play a critical role in neural networks; they significantly affect both training effectiveness and overall model performance. Proper hyper-parameter settings can accelerate model convergence and improve generalization. Among the various hyper-parameters, the learning rate is particularly important. However, optimizing the learning rate typically requires extensive experimentation and tuning: its setting often depends on the specific task and dataset, so there are no universal rules or standards. Adjustments are therefore generally made through trial and error, which makes selecting the learning rate complex and time-consuming. To surmount this challenge, evolutionary computation algorithms can adjust the learning rate automatically, improving training efficiency and model performance. Accordingly, we propose a black widow optimization algorithm based on Lagrange interpolation (LIBWONN) to optimize the learning rate of ResNet18. We evaluate LIBWONN's effectiveness on 24 benchmark functions from CEC2017 and CEC2022 and compare it with nine state-of-the-art metaheuristic algorithms. The experimental results indicate that LIBWONN outperforms the other algorithms in convergence and stability. Additionally, experiments on publicly available datasets from six different fields show that LIBWONN improves accuracy on both the training and testing sets compared to the standard BWO, with gains of 6.99% and 4.48%, respectively.
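To make the abstract's idea concrete, the sketch below illustrates the general pattern it describes: a small population-based search over candidate learning rates, with a three-point Lagrange interpolation step used to propose a refined candidate. This is a minimal illustration under stated assumptions, not the paper's LIBWONN; the function names (lagrange_minimum, search_learning_rate), the population size, and the stand-in loss are all hypothetical, and the actual interpolation operator and black widow reproduction/cannibalism phases are specified only in the paper.

```python
# Minimal, hypothetical sketch: population search over learning rates with a
# three-point Lagrange interpolation refinement step. Names and parameters are
# illustrative; this is not the paper's LIBWONN implementation.
import random


def lagrange_minimum(xs, ys):
    """Return the minimizer of the quadratic Lagrange interpolant through the
    three points (xs[i], ys[i]); fall back to the best sampled point if the
    parabola is degenerate or opens downward."""
    (x0, x1, x2), (y0, y1, y2) = xs, ys
    if len({x0, x1, x2}) < 3:  # coincident abscissae: interpolant undefined
        return xs[ys.index(min(ys))]
    denom = (x0 - x1) * (x0 - x2) * (x1 - x2)
    # Coefficients of p(x) = a*x^2 + b*x + c fitted through the three points.
    a = (x2 * (y1 - y0) + x1 * (y0 - y2) + x0 * (y2 - y1)) / denom
    b = (x2**2 * (y0 - y1) + x1**2 * (y2 - y0) + x0**2 * (y1 - y2)) / denom
    if a <= 0:  # no interior minimum
        return xs[ys.index(min(ys))]
    return -b / (2 * a)  # vertex of the parabola


def search_learning_rate(loss_fn, lo=1e-5, hi=1e-1, pop=8, iters=20):
    """Toy population-based search for a learning rate minimizing loss_fn."""
    cand = [random.uniform(lo, hi) for _ in range(pop)]
    for _ in range(iters):
        cand.sort(key=loss_fn)  # best candidates first
        xs = cand[:3]  # interpolate through the three best
        child = lagrange_minimum(xs, [loss_fn(x) for x in xs])
        child = min(max(child, lo), hi)  # clamp to the search range
        cand[-1] = child  # replace the worst candidate
    return min(cand, key=loss_fn)


if __name__ == "__main__":
    # Stand-in objective: pretend validation loss is minimized near lr = 0.01.
    best = search_learning_rate(lambda lr: (lr - 0.01) ** 2)
    print(f"best learning rate: {best:.5f}")
```

In the setting the abstract describes, loss_fn would train ResNet18 briefly at the candidate learning rate and return a validation loss, which is why such methods keep the population and iteration budget small: each fitness evaluation is a (partial) training run.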

Source journal: Biomimetics (Biochemistry, Genetics and Molecular Biology-Biotechnology)
CiteScore: 3.50 · Self-citation rate: 11.10% · Articles per year: 189 · Review time: 11 weeks