Narayana Darapaneni, A. Paduri, Anjan Arun Bhowmick, P. Ranjini, T. Kavitha, Suresh Rajendran, N. Veeresh, N. Vignesh
{"title":"一种用于图像分类的非单调激活函数","authors":"Narayana Darapaneni, A. Paduri, Anjan Arun Bhowmick, P. Ranjini, T. Kavitha, Suresh Rajendran, N. Veeresh, N. Vignesh","doi":"10.1109/IConSCEPT57958.2023.10170022","DOIUrl":null,"url":null,"abstract":"By providing non-linearity and enabling the network to understand complicated associations in the data, activation functions play a vital role in the performance of neural networks. Here, we introduce Esh, a brand-new activation function with the formula, $f(x) = xtanh(sigmoid(x))$. Using CNN architectures, we assess Esh’s performance on the MNIST, CIFAR10, and CIFAR-100 data sets. Our tests demonstrate that the Esh activation function outperforms a number of well-known activation functions, including ReLU, GELU, Mish, and Swish. In fact, compared to other activation functions, the Esh activation function has a more consistent loss landscape. Esh is a potential new activation function for deep neural networks, according to the findings of our study, and we anticipate that it will be widely used in the machine learning industry.","PeriodicalId":240167,"journal":{"name":"2023 International Conference on Signal Processing, Computation, Electronics, Power and Telecommunication (IConSCEPT)","volume":"17 4","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"ESH: A Non-Monotonic Activation Function For Image Classification\",\"authors\":\"Narayana Darapaneni, A. Paduri, Anjan Arun Bhowmick, P. Ranjini, T. Kavitha, Suresh Rajendran, N. Veeresh, N. Vignesh\",\"doi\":\"10.1109/IConSCEPT57958.2023.10170022\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"By providing non-linearity and enabling the network to understand complicated associations in the data, activation functions play a vital role in the performance of neural networks. Here, we introduce Esh, a brand-new activation function with the formula, $f(x) = xtanh(sigmoid(x))$. Using CNN architectures, we assess Esh’s performance on the MNIST, CIFAR10, and CIFAR-100 data sets. Our tests demonstrate that the Esh activation function outperforms a number of well-known activation functions, including ReLU, GELU, Mish, and Swish. In fact, compared to other activation functions, the Esh activation function has a more consistent loss landscape. 
Esh is a potential new activation function for deep neural networks, according to the findings of our study, and we anticipate that it will be widely used in the machine learning industry.\",\"PeriodicalId\":240167,\"journal\":{\"name\":\"2023 International Conference on Signal Processing, Computation, Electronics, Power and Telecommunication (IConSCEPT)\",\"volume\":\"17 4\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-05-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 International Conference on Signal Processing, Computation, Electronics, Power and Telecommunication (IConSCEPT)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IConSCEPT57958.2023.10170022\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 International Conference on Signal Processing, Computation, Electronics, Power and Telecommunication (IConSCEPT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IConSCEPT57958.2023.10170022","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
ESH: A Non-Monotonic Activation Function For Image Classification
By providing non-linearity and enabling the network to learn complicated associations in the data, activation functions play a vital role in the performance of neural networks. Here, we introduce Esh, a new activation function defined as $f(x) = x \tanh(\mathrm{sigmoid}(x))$. Using CNN architectures, we assess Esh's performance on the MNIST, CIFAR-10, and CIFAR-100 data sets. Our experiments demonstrate that the Esh activation function outperforms a number of well-known activation functions, including ReLU, GELU, Mish, and Swish. Compared to these activation functions, Esh also exhibits a smoother, more consistent loss landscape. Based on the findings of our study, Esh is a promising new activation function for deep neural networks, and we anticipate that it will find wide use in the machine learning community.
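A minimal sketch of the Esh activation, following the formula $f(x) = x \tanh(\mathrm{sigmoid}(x))$ given in the abstract. This NumPy implementation is illustrative only; the function and variable names are ours, not the authors' reference code.

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + exp(-x)).
    return 1.0 / (1.0 + np.exp(-x))

def esh(x):
    # Esh activation as defined in the paper: f(x) = x * tanh(sigmoid(x)).
    # Like Mish and Swish, it is smooth and non-monotonic (slightly negative
    # for small negative inputs, approximately linear for large positive inputs).
    return x * np.tanh(sigmoid(x))

# Example: apply Esh to a range of pre-activation values.
z = np.linspace(-5.0, 5.0, 11)
print(esh(z))
```

In practice, such an activation would be used element-wise in place of ReLU/Swish inside a CNN layer (e.g. wrapped in a framework-specific module); the sketch above only demonstrates the scalar formula.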