{"title":"Hybrid Algorithms Based on Two Evolutionary Computations for Image Classification.","authors":"Peiyang Wei, Rundong Zou, Jianhong Gan, Zhibin Li","doi":"10.3390/biomimetics10080544","DOIUrl":null,"url":null,"abstract":"<p><p>Convolutional neural networks (CNNs) and their improved models (like DenseNet-121) have achieved significant results in image classification tasks. However, the performance of these models is still constrained by issues such as hyperparameter optimization and gradient vanishing and exploding. Owing to their unique exploration and exploitation capabilities, evolutionary algorithms offer new avenues for addressing these problems. Simultaneously, to prevent these algorithms from falling into a local optimum during the search process, this study designs a novel interpolation algorithm. To achieve better image classification performance, thus enhancing classification accuracy and boosting model stability, this paper utilizes a hybrid algorithm based on the horned lizard algorithm with quadratic interpolation and the giant armadillo optimization with Newton interpolation (HGAO) to optimize the hyperparameters of DenseNet-121. It is applied to five datasets spanning different domains. The learning rate and dropout rate have notable impacts on the outcomes of the DenseNet-121 model, which are chosen as the hyperparameters to be optimized. Experiments are conducted using the HGAO algorithm on five image datasets and compared with nine state-of-the-art algorithms. The performance of the model is evaluated based on accuracy, precision, recall, and F1-score metrics. The experimental results reveal that the combination of hyperparameters becomes more reasonable after optimization with the HGAO algorithm, thus providing a crucial improvement. In the comparative experiments, the accuracy of the image classification on the training set increased by up to 0.5%, with a maximum reduction in loss of 0.018. On the test set, the accuracy rose by 0.5%, and the loss decreased by 54 points. The HGAO algorithm provides an effective solution for optimizing the DenseNet-121 model. The designed method boosts classification accuracy and model stability, which also dramatically augments hyperparameter optimization effects and resolves gradient difficulties.</p>","PeriodicalId":8907,"journal":{"name":"Biomimetics","volume":"10 8","pages":""},"PeriodicalIF":3.9000,"publicationDate":"2025-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12383529/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Biomimetics","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.3390/biomimetics10080544","RegionNum":3,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
引用次数: 0
Abstract
Convolutional neural networks (CNNs) and their improved variants (such as DenseNet-121) have achieved significant results in image classification tasks. However, the performance of these models is still constrained by hyperparameter optimization and by vanishing and exploding gradients. Owing to their distinctive exploration and exploitation capabilities, evolutionary algorithms offer new avenues for addressing these problems. At the same time, to prevent such algorithms from becoming trapped in local optima during the search, this study designs a novel interpolation strategy. To improve image classification performance, in both classification accuracy and model stability, this paper applies a hybrid algorithm (HGAO) that combines the horned lizard optimization algorithm with quadratic interpolation and the giant armadillo optimization with Newton interpolation to tune the hyperparameters of DenseNet-121. The method is applied to five datasets spanning different domains. The learning rate and dropout rate, which notably affect the outcomes of the DenseNet-121 model, are chosen as the hyperparameters to optimize. Experiments run the HGAO algorithm on the five image datasets and compare it against nine state-of-the-art algorithms, evaluating model performance with accuracy, precision, recall, and F1-score. The results show that the hyperparameter combination becomes more reasonable after optimization with HGAO, yielding a decisive improvement. In the comparative experiments, image classification accuracy on the training set increased by up to 0.5%, with a maximum loss reduction of 0.018; on the test set, accuracy rose by 0.5% and the loss decreased by 54 points. HGAO thus provides an effective approach to optimizing the DenseNet-121 model: it boosts classification accuracy and model stability, markedly strengthens hyperparameter optimization, and mitigates gradient problems.
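The abstract names quadratic and Newton interpolation as the local-optimum escape mechanisms but does not reproduce the update equations. The sketch below shows the textbook form of both: the vertex of a parabola fitted through three known solutions, and the stationary point of a degree-2 Newton (divided-difference) polynomial. The function names, the elementwise treatment of vector positions, and the `eps` guard are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def quadratic_interpolation(a, b, c, fa, fb, fc, eps=1e-12):
    """Vertex of the parabola through (a, fa), (b, fb), (c, fc).

    A common escape step in evolutionary algorithms: the candidate
    jumps to the minimum suggested by three known solutions (e.g. the
    best, a random individual, and the current one). Works elementwise
    when a, b, c are position vectors with scalar fitness values.
    """
    num = (b**2 - c**2) * fa + (c**2 - a**2) * fb + (a**2 - b**2) * fc
    den = (b - c) * fa + (c - a) * fb + (a - b) * fc
    return 0.5 * num / (den + eps)

def newton_interpolation(x0, x1, x2, f0, f1, f2, eps=1e-12):
    """Stationary point of the degree-2 Newton polynomial
    p(x) = f0 + d01*(x - x0) + d012*(x - x0)*(x - x1),
    built from the divided differences d01, d12, d012."""
    d01 = (f1 - f0) / (x1 - x0 + eps)
    d12 = (f2 - f1) / (x2 - x1 + eps)
    d012 = (d12 - d01) / (x2 - x0 + eps)
    return 0.5 * (x0 + x1) - d01 / (2.0 * d012 + eps)

# Toy check on f(x) = (x - 0.3)**2: both recover the minimum at 0.3.
xs = np.array([0.0, 0.5, 1.0])
fs = (xs - 0.3) ** 2
print(quadratic_interpolation(*xs, *fs))  # ~0.3
print(newton_interpolation(*xs, *fs))     # ~0.3
```

Both formulas are exact for quadratic objectives, which is why a single interpolation step can relocate a stagnant individual directly to a promising basin.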
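To make the overall workflow concrete, here is a minimal, hypothetical sketch of interpolation-assisted evolutionary search over the two hyperparameters the paper tunes (learning rate and dropout rate). The search bounds, population size, generation count, and the two stand-in movement operators are all assumptions; the actual horned lizard and giant armadillo updates are not given in the abstract, and `evaluate_densenet` is a toy stub standing in for a full train/validate cycle of DenseNet-121.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed search bounds for [learning_rate, dropout_rate].
LOW = np.array([1e-5, 0.0])
HIGH = np.array([1e-1, 0.5])

def evaluate_densenet(params):
    """Hypothetical fitness stub. In the paper this step trains and
    validates DenseNet-121 with the given learning rate and dropout
    rate; a smooth toy surface (minimum at lr=1e-3, dropout=0.2)
    stands in so the sketch runs end to end. Lower is better."""
    lr, drop = params
    return (np.log10(lr) + 3.0) ** 2 + (drop - 0.2) ** 2

def clip(x):
    return np.clip(x, LOW, HIGH)

pop = rng.uniform(LOW, HIGH, size=(20, 2))
fit = np.array([evaluate_densenet(p) for p in pop])

for gen in range(50):
    best = pop[fit.argmin()].copy()
    for i in range(len(pop)):
        # Stand-in exploration/exploitation moves; the real horned
        # lizard and giant armadillo operators would go here.
        if i % 2 == 0:
            cand = clip(pop[i] + rng.normal(0.0, 0.5) * (best - pop[i]))
        else:
            j = rng.integers(len(pop))
            cand = clip(best + rng.uniform(-1.0, 1.0, 2) * (pop[j] - pop[i]))
        f_cand = evaluate_densenet(cand)
        if f_cand < fit[i]:          # greedy selection
            pop[i], fit[i] = cand, f_cand

    # Interpolation refinement: quadratic vertex through the three best
    # individuals (same formula as the previous sketch), replacing the
    # worst individual when it improves on it.
    order = fit.argsort()
    (a, b, c), (fa, fb, fc) = pop[order[:3]], fit[order[:3]]
    num = (b**2 - c**2) * fa + (c**2 - a**2) * fb + (a**2 - b**2) * fc
    den = (b - c) * fa + (c - a) * fb + (a - b) * fc
    cand = clip(0.5 * num / (den + 1e-12))
    f_cand = evaluate_densenet(cand)
    worst = order[-1]
    if f_cand < fit[worst]:
        pop[worst], fit[worst] = cand, f_cand

best = pop[fit.argmin()]
print("best learning rate %.2e, dropout rate %.3f" % (best[0], best[1]))
```

In the real setting each fitness evaluation is a full training-and-validation run, so population size and generation count dominate the cost of the search.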