{"title":"From ReLU to GeMU: Activation functions in the lens of cone projection","authors":"Jiayun Li, Yuxiao Cheng, Yiwen Lu, Zhuofan Xia, Yilin Mo, Gao Huang","doi":"10.1016/j.neunet.2025.107654","DOIUrl":null,"url":null,"abstract":"<div><div>Activation functions are essential to introduce nonlinearity into neural networks, with the Rectified Linear Unit (ReLU) often favored for its simplicity and effectiveness. Motivated by the structural similarity between a single layer of the Feedforward Neural Network (FNN) and a single iteration of the Projected Gradient Descent (PGD) algorithm for constrained optimization problems, we consider ReLU as a projection from <span><math><mi>R</mi></math></span> onto the nonnegative half-line <span><math><msub><mrow><mi>R</mi></mrow><mrow><mo>+</mo></mrow></msub></math></span>. Building on this interpretation, we generalize ReLU to a Generalized Multivariate projection Unit (GeMU), a projection operator onto a convex cone, such as the Second-Order Cone (SOC). We prove that the expressive power of FNNs activated by our proposed GeMU is strictly greater than those activated by ReLU. Experimental evaluations further corroborate that GeMU is versatile across prevalent architectures and distinct tasks, and that it can outperform various existing activation functions.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"190 ","pages":"Article 107654"},"PeriodicalIF":6.0000,"publicationDate":"2025-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025005349","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Activation functions are essential to introduce nonlinearity into neural networks, with the Rectified Linear Unit (ReLU) often favored for its simplicity and effectiveness. Motivated by the structural similarity between a single layer of a Feedforward Neural Network (FNN) and a single iteration of the Projected Gradient Descent (PGD) algorithm for constrained optimization problems, we consider ReLU as a projection from $\mathbb{R}$ onto the nonnegative half-line $\mathbb{R}_{+}$. Building on this interpretation, we generalize ReLU to a Generalized Multivariate projection Unit (GeMU), a projection operator onto a convex cone, such as the Second-Order Cone (SOC). We prove that the expressive power of FNNs activated by our proposed GeMU is strictly greater than that of FNNs activated by ReLU. Experimental evaluations further corroborate that GeMU is versatile across prevalent architectures and distinct tasks, and that it can outperform various existing activation functions.
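The abstract does not spell out GeMU's exact parameterization, but the closed-form Euclidean projection onto the second-order cone $K = \{(x, t) \in \mathbb{R}^{n} \times \mathbb{R} : \|x\|_2 \le t\}$ that it alludes to is standard. The sketch below is a minimal illustration of how such a projection could act as a multivariate activation; the PyTorch implementation, the function name soc_projection, and the grouping convention are illustrative assumptions, not taken from the paper.

```python
import torch

def soc_projection(z: torch.Tensor, eps: float = 1e-12) -> torch.Tensor:
    """Euclidean projection of each row z = (x, t) in R^{n+1} onto the
    second-order cone K = {(x, t) : ||x||_2 <= t}.

    z has shape (..., n + 1); the last entry of each row is t.
    """
    x, t = z[..., :-1], z[..., -1:]
    norm_x = x.norm(dim=-1, keepdim=True)

    inside = norm_x <= t        # already in the cone: keep the point
    polar = norm_x <= -t        # in the polar cone: project to the origin
    # Otherwise: project onto the cone boundary, scaling (x / ||x||, 1)
    # by alpha = (||x|| + t) / 2.
    alpha = (norm_x + t) / 2.0
    x_bd = alpha * x / norm_x.clamp_min(eps)

    proj_x = torch.where(inside, x, torch.where(polar, torch.zeros_like(x), x_bd))
    proj_t = torch.where(inside, t, torch.where(polar, torch.zeros_like(t), alpha))
    return torch.cat([proj_x, proj_t], dim=-1)


# Example: apply the projection to groups of 4 pre-activations (hypothetical grouping).
h = torch.randn(8, 64)                                  # batch of pre-activations
h = soc_projection(h.view(8, 16, 4)).view(8, 64)        # project each group onto the SOC
```

With $n = 0$ the cone degenerates to $\mathbb{R}_{+}$ and the projection reduces to $\max(t, 0)$, i.e. ReLU, which is the generalization the abstract describes. How the paper actually partitions a layer's pre-activations into $(x, t)$ groups is not stated in the abstract.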
Journal Introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.