{"title":"A hyper-heuristic enhanced neuro-evolutionary algorithm with self-adaptive operators and various activation functions for classification problems","authors":"Fehmi Burcin Ozsoydan , İlker Gölcük , Esra Duygu Durmaz","doi":"10.1016/j.neunet.2025.107751","DOIUrl":null,"url":null,"abstract":"<div><div>Due to their remarkable generalization capabilities, Artificial Neural Networks (ANNs) grab attention of researchers and practitioners. ANNs have two main stages, namely training and testing. The training stage aims to find optimum synapse values. Traditional gradient-descent-based approaches offer notable advantages in training ANNs, nevertheless, they are exposed to some limitations such as convergence and local minima issues. Therefore, stochastic search algorithms are commonly employed. In this regard, the present study adopts and further extends the evolutionary search strategies namely, the Differential Evolution (DE) algorithm and the Self-Adaptive Differential Evolution (SaDE) Algorithm in training ANNs. Accordingly, self-adaptation procedures are modified to perform a more convenient search and to avoid local optima first. Secondarily, mutation operations in these algorithms are reinforced by a hyper-heuristic framework, which selects low-level heuristics out of a heuristics pool, based on their achievements throughout search. Thus, to achieve better performance, while promising mechanisms are more frequently invoked by the proposed approach, naïve operators are less frequently invoked. This implicitly avoids greedy behavior in selecting low-level heuristics and attempts to overcome loss-of-diversity and local optima issues. Moreover, due to possible complexity and nonlinearity in mapping between inputs and outputs, the proposed method is also tested by using various activation functions. Thus, a hyper-heuristic enhanced neuro-evolutionary algorithm with self-adaptive operators is introduced. Finally, the performances of all used methods with various activation functions are tested on the well-known classification problems. Statistically verified computational results point out significant differences among the algorithms and used activation functions.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"190 ","pages":"Article 107751"},"PeriodicalIF":6.3000,"publicationDate":"2025-06-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025006318","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Cited by: 0
Abstract
Owing to their remarkable generalization capabilities, Artificial Neural Networks (ANNs) have attracted considerable attention from researchers and practitioners. ANNs involve two main stages, namely training and testing. The training stage aims to find optimal synaptic weight values. Traditional gradient-descent-based approaches offer notable advantages in training ANNs; nevertheless, they suffer from limitations such as convergence and local-minima issues. Therefore, stochastic search algorithms are commonly employed instead. In this regard, the present study adopts and further extends two evolutionary search strategies, namely the Differential Evolution (DE) algorithm and the Self-Adaptive Differential Evolution (SaDE) algorithm, for training ANNs. First, the self-adaptation procedures are modified to conduct a more effective search and to avoid local optima. Second, the mutation operations of these algorithms are reinforced by a hyper-heuristic framework that selects low-level heuristics from a heuristics pool based on their performance throughout the search. Thus, promising mechanisms are invoked more frequently by the proposed approach, while naïve operators are invoked less frequently. This implicitly avoids greedy behavior in selecting low-level heuristics and helps overcome loss-of-diversity and local-optima issues. Moreover, because the mapping between inputs and outputs can be complex and nonlinear, the proposed method is also tested with various activation functions. The result is a hyper-heuristic enhanced neuro-evolutionary algorithm with self-adaptive operators. Finally, the performances of all the methods, combined with various activation functions, are evaluated on well-known classification problems. Statistically verified computational results reveal significant differences among both the algorithms and the activation functions.
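The abstract describes the method only at a high level. The following is a minimal, hypothetical sketch of the core idea for orientation: a one-hidden-layer ANN trained by DE, with the mutation operator drawn from a pool of low-level heuristics in proportion to past success (a simple credit-based hyper-heuristic rule in the spirit of SaDE), and a selectable hidden-layer activation function. Everything here — the network size, the three mutation strategies, the credit rule, and all names — is an assumption for illustration, not the authors' implementation.

```python
# Sketch only: DE-trained ANN with credit-based selection of mutation operators.
# NOT the paper's code; all design choices below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Activation functions to vary, as the abstract describes.
ACTIVATIONS = {
    "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
    "tanh": np.tanh,
    "relu": lambda x: np.maximum(0.0, x),
}

def forward(weights, X, n_in, n_hid, act):
    """Decode a flat weight vector into a 1-hidden-layer net and run it."""
    w1_end = n_in * n_hid
    W1 = weights[:w1_end].reshape(n_in, n_hid)
    b1 = weights[w1_end:w1_end + n_hid]
    W2 = weights[w1_end + n_hid:w1_end + 2 * n_hid]
    b2 = weights[-1]
    h = act(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output, binary labels

def loss(weights, X, y, n_in, n_hid, act):
    p = forward(weights, X, n_in, n_hid, act)
    return np.mean((p - y) ** 2)                 # MSE as a simple training error

# A pool of classic DE mutation strategies: the "low-level heuristics".
def rand_1(pop, i, best, F):
    a, b, c = pop[rng.choice(len(pop), 3, replace=False)]
    return a + F * (b - c)

def best_1(pop, i, best, F):
    a, b = pop[rng.choice(len(pop), 2, replace=False)]
    return best + F * (a - b)

def current_to_best_1(pop, i, best, F):
    a, b = pop[rng.choice(len(pop), 2, replace=False)]
    return pop[i] + F * (best - pop[i]) + F * (a - b)

POOL = [rand_1, best_1, current_to_best_1]

def train(X, y, n_hid=5, pop_size=30, gens=200, F=0.5, CR=0.9, act_name="tanh"):
    act = ACTIVATIONS[act_name]
    n_in = X.shape[1]
    dim = n_in * n_hid + n_hid + n_hid + 1       # W1, b1, W2, b2 flattened
    pop = rng.uniform(-1.0, 1.0, (pop_size, dim))
    fit = np.array([loss(ind, X, y, n_in, n_hid, act) for ind in pop])
    credit = np.ones(len(POOL))                  # success credit per heuristic
    for _ in range(gens):
        best = pop[np.argmin(fit)]
        for i in range(pop_size):
            # Hyper-heuristic step: pick a mutation operator by past success,
            # so promising operators are invoked more often over time.
            k = rng.choice(len(POOL), p=credit / credit.sum())
            mutant = POOL[k](pop, i, best, F)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True      # guarantee one crossed gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = loss(trial, X, y, n_in, n_hid, act)
            if f_trial < fit[i]:                 # greedy one-to-one DE selection
                pop[i], fit[i] = trial, f_trial
                credit[k] += 1.0                 # reward the operator that worked
    return pop[np.argmin(fit)], fit.min()
```

A call such as `train(X, y, act_name="relu")` would return the best weight vector found and its training error; looping over the keys of `ACTIVATIONS` mirrors, in miniature, the kind of activation-function comparison the abstract mentions.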
Journal Introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.