{"title":"一种新的基于Lagrange插值算子的黑寡妇优化算法。","authors":"Peiyang Wei, Can Hu, Jingyi Hu, Zhibin Li, Wen Qin, Jianhong Gan, Tinghui Chen, Hongping Shu, Mingsheng Shang","doi":"10.3390/biomimetics10060361","DOIUrl":null,"url":null,"abstract":"<p><p>Hyper-parameters play a critical role in neural networks; they significantly impact both training effectiveness and overall model performance. Proper hyper-parameter settings can accelerate model convergence and improve generalization. Among various hyper-parameters, the learning rate is particularly important. However, optimizing the learning rate typically requires extensive experimentation and tuning, as its setting is often dependent on specific tasks and datasets and therefore lacks universal rules or standards. Consequently, adjustments are generally made through trial and error, thereby making the selection of the learning rate complex and time-consuming. In an attempt to surmount this challenge, evolutionary computation algorithms can automatically adjust the hyper-parameter learning rate to improve training efficiency and model performance. In response to this, we propose a black widow optimization algorithm based on Lagrange interpolation (LIBWONN) to optimize the learning rate of ResNet18. Moreover, we evaluate LIBWONN's effectiveness using 24 benchmark functions from CEC2017 and CEC2022 and compare it with nine advanced metaheuristic algorithms. The experimental results indicate that LIBWONN outperforms the other algorithms in convergence and stability. Additionally, experiments on publicly available datasets from six different fields demonstrate that LIBWONN improves the accuracy on both training and testing sets compared to the standard BWO, with gains of 6.99% and 4.48%, respectively.</p>","PeriodicalId":8907,"journal":{"name":"Biomimetics","volume":"10 6","pages":""},"PeriodicalIF":3.9000,"publicationDate":"2025-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12190972/pdf/","citationCount":"0","resultStr":"{\"title\":\"A Novel Black Widow Optimization Algorithm Based on Lagrange Interpolation Operator for ResNet18.\",\"authors\":\"Peiyang Wei, Can Hu, Jingyi Hu, Zhibin Li, Wen Qin, Jianhong Gan, Tinghui Chen, Hongping Shu, Mingsheng Shang\",\"doi\":\"10.3390/biomimetics10060361\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Hyper-parameters play a critical role in neural networks; they significantly impact both training effectiveness and overall model performance. Proper hyper-parameter settings can accelerate model convergence and improve generalization. Among various hyper-parameters, the learning rate is particularly important. However, optimizing the learning rate typically requires extensive experimentation and tuning, as its setting is often dependent on specific tasks and datasets and therefore lacks universal rules or standards. Consequently, adjustments are generally made through trial and error, thereby making the selection of the learning rate complex and time-consuming. In an attempt to surmount this challenge, evolutionary computation algorithms can automatically adjust the hyper-parameter learning rate to improve training efficiency and model performance. In response to this, we propose a black widow optimization algorithm based on Lagrange interpolation (LIBWONN) to optimize the learning rate of ResNet18. 
Moreover, we evaluate LIBWONN's effectiveness using 24 benchmark functions from CEC2017 and CEC2022 and compare it with nine advanced metaheuristic algorithms. The experimental results indicate that LIBWONN outperforms the other algorithms in convergence and stability. Additionally, experiments on publicly available datasets from six different fields demonstrate that LIBWONN improves the accuracy on both training and testing sets compared to the standard BWO, with gains of 6.99% and 4.48%, respectively.</p>\",\"PeriodicalId\":8907,\"journal\":{\"name\":\"Biomimetics\",\"volume\":\"10 6\",\"pages\":\"\"},\"PeriodicalIF\":3.9000,\"publicationDate\":\"2025-06-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12190972/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Biomimetics\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.3390/biomimetics10060361\",\"RegionNum\":3,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Biomimetics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.3390/biomimetics10060361","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
A Novel Black Widow Optimization Algorithm Based on Lagrange Interpolation Operator for ResNet18.
Hyper-parameters play a critical role in neural networks: they strongly affect both training efficiency and overall model performance, and proper settings can accelerate convergence and improve generalization. Among the various hyper-parameters, the learning rate is particularly important. However, tuning the learning rate typically requires extensive experimentation, because a good setting depends on the specific task and dataset, and no universal rule or standard exists. Adjustments are therefore generally made through trial and error, which makes learning-rate selection complex and time-consuming. To overcome this challenge, evolutionary computation algorithms can adjust the learning rate automatically, improving training efficiency and model performance. Accordingly, we propose a black widow optimization algorithm based on Lagrange interpolation (LIBWONN) to optimize the learning rate of ResNet18. We evaluate LIBWONN's effectiveness on 24 benchmark functions from CEC2017 and CEC2022 and compare it with nine state-of-the-art metaheuristic algorithms. The experimental results indicate that LIBWONN outperforms the other algorithms in convergence and stability. Additionally, experiments on publicly available datasets from six different fields show that LIBWONN improves accuracy over the standard BWO on both the training and testing sets, with gains of 6.99% and 4.48%, respectively.
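The abstract does not spell out how the Lagrange interpolation operator is combined with black widow optimization (BWO), but the general idea behind such hybrids is to fit a quadratic Lagrange interpolant through the best candidate solutions and jump to its stationary point, exploiting local curvature that pure population updates ignore. The sketch below illustrates that idea on a one-dimensional learning-rate search; it is a minimal, hypothetical reconstruction, not the paper's method. The names `lagrange_refine` and `bwo_lr_search`, the simplified procreation/cannibalism steps, and the toy loss function are all assumptions made for illustration.

```python
import random

def lagrange_refine(x, f):
    """Return the stationary point of the quadratic Lagrange interpolant
    through three sample points x = (x0, x1, x2) with objective values
    f = (f0, f1, f2). For a learning-rate search this estimates where
    the loss curve bottoms out between the samples."""
    x0, x1, x2 = x
    f0, f1, f2 = f
    # Closed-form vertex of the interpolating parabola.
    num = f0 * (x1**2 - x2**2) + f1 * (x2**2 - x0**2) + f2 * (x0**2 - x1**2)
    den = 2.0 * (f0 * (x1 - x2) + f1 * (x2 - x0) + f2 * (x0 - x1))
    if abs(den) < 1e-12:  # (near-)collinear points: no curvature to exploit
        return x1
    return num / den

def bwo_lr_search(objective, lo=1e-5, hi=1e-1, pop_size=10, iters=20, seed=0):
    """Toy black-widow-style search over a 1-D learning-rate interval:
    'procreation' blends parent pairs, 'cannibalism' keeps the fitter
    survivors, and the three best survivors feed the Lagrange refinement."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(iters):
        # Procreation: blend random parent pairs (simplified BWO crossover).
        children = []
        for _ in range(pop_size):
            a, b = rng.sample(pop, 2)
            alpha = rng.random()
            children.append(min(max(alpha * a + (1 - alpha) * b, lo), hi))
        # Cannibalism: keep only the fittest individuals.
        pop = sorted(pop + children, key=objective)[:pop_size]
        # Lagrange operator: interpolate through the three best and refine.
        xs = sorted(pop[:3])
        cand = lagrange_refine(xs, tuple(objective(v) for v in xs))
        if lo <= cand <= hi and objective(cand) < objective(pop[-1]):
            pop[-1] = cand  # replace the worst survivor with the refined point
    return min(pop, key=objective)

# Usage: a stand-in loss whose best learning rate is near 3e-3.
loss = lambda lr: (lr - 3e-3) ** 2
print(bwo_lr_search(loss))  # converges toward ~0.003
```

In the paper, the objective would instead be the validation loss of ResNet18 trained briefly at each candidate learning rate, which is far more expensive per evaluation; the interpolation step matters precisely because it can squeeze a better candidate out of evaluations the population search has already paid for.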