AHerfReLU: A Novel Adaptive Activation Function Enhancing Deep Neural Network Performance
Abaid Ullah, Muhammad Imran, Muhammad Abdul Basit, Madeeha Tahir, Jihad Younis
Complexity, vol. 2025, no. 1. Published 2025-04-21. DOI: 10.1155/cplx/8233876
Full text: https://onlinelibrary.wiley.com/doi/10.1155/cplx/8233876
Citations: 0
Abstract
In deep learning, the choice of activation function plays a vital role in enhancing model performance. We propose AHerfReLU, a novel activation function that combines the rectified linear unit (ReLU) function with the error function (erf), complemented by a regularization term 1/(1 + x²), ensuring smooth gradients even for negative inputs. The function is zero centered, bounded below, and nonmonotonic, offering significant advantages over traditional activation functions like ReLU. We compare AHerfReLU with 10 adaptive activation functions and state-of-the-art activation functions, including ReLU, Swish, and Mish. Experimental results show that replacing ReLU with AHerfReLU leads to a 3.18% improvement in Top-1 accuracy on the LeNet network for the CIFAR100 dataset, a 0.63% improvement on CIFAR10, and a 1.3% improvement in mean average precision (mAP) for the SSD300 model on the Pascal VOC dataset. Our results demonstrate that AHerfReLU enhances model performance, offering improved accuracy, loss reduction, and convergence stability. The function outperforms existing activation functions, providing a promising alternative for deep learning tasks.
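The abstract does not state AHerfReLU's closed form. Purely as an illustrative sketch, the PyTorch snippet below shows one hypothetical way the described ingredients (a ReLU branch, an erf-based negative branch, and the 1/(1 + x²) damping term) could be combined so that the result is zero centered, bounded below, and nonmonotonic; the class name AHerfReLUSketch and the exact formula are assumptions, not the authors' definition.

```python
import torch
import torch.nn as nn


class AHerfReLUSketch(nn.Module):
    """Illustrative sketch only; the paper's exact formula is not given here.

    Hypothetical combination of ReLU, erf, and a 1/(1 + x^2) damping term
    with the qualitative properties described in the abstract.
    """

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Positive inputs pass through unchanged, as in plain ReLU.
        positive = torch.relu(x)
        # Negative inputs get a smooth, bounded response: erf(x) is negative
        # and saturating, and the 1/(1 + x^2) factor damps it toward zero for
        # large |x|, keeping gradients nonzero (but small) below zero.
        negative = torch.erf(x) / (1.0 + x.pow(2))
        return positive + torch.where(x < 0, negative, torch.zeros_like(x))


# Minimal usage example: evaluate the sketch and backpropagate through it.
if __name__ == "__main__":
    act = AHerfReLUSketch()
    x = torch.linspace(-3.0, 3.0, steps=7, requires_grad=True)
    y = act(x)
    y.sum().backward()
    print(y.detach(), x.grad)
```

Under this hypothetical form, the negative branch dips below zero and then returns toward zero as x → −∞, which is what makes the function nonmonotonic and bounded below while still providing gradient signal for negative inputs.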
About the journal:
Complexity is a cross-disciplinary journal focusing on the rapidly expanding science of complex adaptive systems. The purpose of the journal is to advance the science of complexity. Articles may deal with such methodological themes as chaos, genetic algorithms, cellular automata, neural networks, and evolutionary game theory. Papers treating applications in any area of natural science or human endeavor are welcome, and especially encouraged are papers integrating conceptual themes and applications that cross traditional disciplinary boundaries. Complexity is not meant to serve as a forum for speculation and vague analogies between words like “chaos,” “self-organization,” and “emergence” that are often used in completely different ways in science and in daily life.