Soft-Clipping Swish: A Novel Activation Function for Deep Learning

Marina Adriana Mercioni, S. Holban
{"title":"Soft-Clipping Swish: A Novel Activation Function for Deep Learning","authors":"Marina Adriana Mercioni, S. Holban","doi":"10.1109/SACI51354.2021.9465622","DOIUrl":null,"url":null,"abstract":"This study aims to contribute to the improvement of the network’s performance through developing a novel activation function. Over time, many activation functions have been proposed in order to solve the issues of the previous functions. We note here more than 50 activation functions that have been proposed, some of them being very popular such as sigmoid, Rectified Linear Unit (ReLU), Swish, Mish but not only. The main idea of this study that stays behind our proposal is a simple one, based on a very popular function called Swish, which is a composition function, having in its componence sigmoid function and ReLU function. Starting from this activation function we decided to ignore the negative region in the way the Rectified Linear Unit does but being different than that one mentioned through a nonlinear curve assured by the Swish positive region. The idea has been come up from a current function called Soft Clipping. We tested this proposal on more datasets in Computer Vision on classification tasks showing its high potential, here we mention MNIST, Fashion-MNIST, CIFAR-10, CIFAR-100 using two popular architectures: LeNet-5 and ResNet20 version 1.","PeriodicalId":321907,"journal":{"name":"2021 IEEE 15th International Symposium on Applied Computational Intelligence and Informatics (SACI)","volume":"34 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE 15th International Symposium on Applied Computational Intelligence and Informatics (SACI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SACI51354.2021.9465622","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 6

Abstract

This study aims to improve network performance by developing a novel activation function. Over time, many activation functions have been proposed to address the shortcomings of earlier ones; more than 50 have been described in the literature, some of them very popular, such as the sigmoid, the Rectified Linear Unit (ReLU), Swish, and Mish. The idea behind our proposal is simple and is based on the very popular Swish function, a composite function built from the sigmoid function and the ReLU function. Starting from Swish, we discard the negative region, as ReLU does, but we differ from ReLU by keeping the nonlinear curve that Swish provides in the positive region. The idea comes from an existing function called Soft Clipping. We tested the proposal on several Computer Vision classification datasets, namely MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100, using two popular architectures, LeNet-5 and ResNet20 version 1; the results show its high potential.
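The abstract describes the proposed function's shape but not its closed form, so the following is a minimal sketch of that description: ReLU-style zeroing for non-positive inputs and the Swish curve x·σ(βx) for positive inputs. The names `swish` and `soft_clipping_swish` and the β parameter are illustrative assumptions, not the authors' notation, and the exact formulation in the paper may differ.

```python
import numpy as np

def sigmoid(x):
    # Logistic function; inputs are clipped to avoid overflow in exp.
    return 1.0 / (1.0 + np.exp(-np.clip(x, -60.0, 60.0)))

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x).
    return x * sigmoid(beta * x)

def soft_clipping_swish(x):
    # Illustrative reading of the abstract: zero in the negative
    # region (as in ReLU), the Swish curve in the positive region.
    # The exact closed form used in the paper may differ.
    return np.where(x > 0, swish(x), 0.0)

# Quick check on a few sample inputs.
x = np.linspace(-4.0, 4.0, 9)
print(np.round(soft_clipping_swish(x), 4))
```

Note that Swish itself is zero at the origin, so this piecewise definition is continuous at x = 0 while still producing exact zeros for all negative inputs, as plain ReLU does.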