Hybrid Algorithms Based on Two Evolutionary Computations for Image Classification

Impact Factor 3.9 · CAS Region 3 (Medicine) · JCR Q1 (Engineering, Multidisciplinary)
Peiyang Wei, Rundong Zou, Jianhong Gan, Zhibin Li
Journal: Biomimetics, vol. 10, no. 8 · Published 2025-08-19 · Journal Article
DOI: 10.3390/biomimetics10080544
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12383529/pdf/
Citations: 0

Abstract


Convolutional neural networks (CNNs) and improved variants such as DenseNet-121 have achieved significant results in image classification tasks. However, their performance is still constrained by hyperparameter optimization and by vanishing and exploding gradients. Owing to their distinctive exploration and exploitation capabilities, evolutionary algorithms offer new avenues for addressing these problems; to keep the search from stagnating in a local optimum, this study additionally designs a novel interpolation scheme. To improve image classification performance, raising accuracy and stabilizing the model, this paper applies a hybrid algorithm (HGAO) that combines the horned lizard optimization algorithm with quadratic interpolation and the giant armadillo optimization with Newton interpolation to tune the hyperparameters of DenseNet-121, applied to five datasets spanning different domains. The learning rate and dropout rate, which notably affect the DenseNet-121 model's results, are chosen as the hyperparameters to optimize. Experiments with HGAO on the five image datasets are compared against nine state-of-the-art algorithms, with model performance evaluated by accuracy, precision, recall, and F1-score. The results show that the hyperparameter combination becomes more reasonable after HGAO optimization, which provides a crucial improvement: in the comparative experiments, training-set classification accuracy increased by up to 0.5% with a maximum loss reduction of 0.018, while test-set accuracy rose by 0.5% and test loss decreased by 54 points. HGAO thus offers an effective way to optimize the DenseNet-121 model: the designed method boosts classification accuracy and model stability, markedly strengthens hyperparameter optimization, and mitigates gradient difficulties.
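As a concrete illustration of the kind of population-based hyperparameter search the abstract describes, the sketch below evolves candidate (learning rate, dropout rate) pairs against a validation objective. This is a minimal generic evolutionary loop, not the authors' HGAO: the `fitness` function here is a synthetic stand-in for training DenseNet-121 and measuring validation accuracy, and the selection and mutation rules are hypothetical placeholders for the horned-lizard and giant-armadillo update rules.

```python
import math
import random

def fitness(lr, dropout):
    # Synthetic stand-in for "train DenseNet-121, return validation accuracy".
    # Peaks near lr = 1e-3, dropout = 0.3 (an arbitrary illustrative optimum).
    return (math.exp(-((math.log10(lr) + 3) ** 2))
            * math.exp(-((dropout - 0.3) ** 2) / 0.02))

def clip(x, lo, hi):
    return max(lo, min(hi, x))

def evolve(pop_size=10, generations=30, seed=0):
    rng = random.Random(seed)
    # Initial population: (learning_rate, dropout_rate) pairs.
    pop = [(10 ** rng.uniform(-5, -1), rng.uniform(0.0, 0.7))
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda p: fitness(*p), reverse=True)
        elite = scored[: pop_size // 2]      # selection: keep the better half
        children = []
        for lr, dr in elite:                 # mutation: Gaussian perturbation
            children.append((
                clip(lr * 10 ** rng.gauss(0, 0.2), 1e-6, 1.0),  # lr varies in log space
                clip(dr + rng.gauss(0, 0.05), 0.0, 0.9),
            ))
        pop = elite + children
    return max(pop, key=lambda p: fitness(*p))

best_lr, best_dropout = evolve()
```

Replacing `fitness` with a real train-and-validate routine turns this into an actual (if slow) hyperparameter optimizer; each fitness call would then cost a full training run, which is why the choice of search strategy matters so much in practice.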
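The quadratic-interpolation idea mentioned in the abstract can be shown in one dimension: fit a parabola through three sampled points and jump to its vertex, letting the search leap toward a promising region instead of creeping there. The closed-form vertex below is the standard quadratic-interpolation minimizer; exactly how HGAO embeds it in the horned-lizard update is not specified in the abstract, so treat this as an assumption-laden illustration.

```python
def quadratic_interp_min(x1, f1, x2, f2, x3, f3):
    # Vertex of the parabola through (x1, f1), (x2, f2), (x3, f3).
    num = (x2 - x1) ** 2 * (f2 - f3) - (x2 - x3) ** 2 * (f2 - f1)
    den = (x2 - x1) * (f2 - f3) - (x2 - x3) * (f2 - f1)
    return x2 - 0.5 * num / den

# For f(x) = (x - 2)^2 the vertex recovered from any three points is x = 2.
xs = [0.0, 1.0, 3.0]
fs = [(x - 2.0) ** 2 for x in xs]
x_star = quadratic_interp_min(xs[0], fs[0], xs[1], fs[1], xs[2], fs[2])  # x_star == 2.0
```

Newton interpolation plays the analogous role on the giant-armadillo side: it builds a polynomial through the sampled points via divided differences rather than solving for a single parabola vertex.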

Source journal: Biomimetics (Biochemistry, Genetics and Molecular Biology / Biotechnology)
CiteScore: 3.50
Self-citation rate: 11.10%
Articles per year: 189
Review time: 11 weeks