Modified Neural Network Activation Function

Adamu I. Abubakar, H. Chiroma, Sameem Abdulkareem, A. Gital, S. A. Muaz, Jafaar Maitama, Muhammad Lamir Isah, T. Herawan
{"title":"Modified Neural Network Activation Function","authors":"Adamu I. Abubakar, H. Chiroma, Sameem Abdulkareem, A. Gital, S. A. Muaz, Jafaar Maitama, Muhammad Lamir Isah, T. Herawan","doi":"10.1109/ICAIET.2014.12","DOIUrl":null,"url":null,"abstract":"Neural Network is said to emulate the brain, though, its processing is not quite how the biological brain really works. The Neural Network has witnessed significant improvement since 1943 to date. However, modifications on the Neural Network mainly focus on the structure itself, not the activation function despite the critical role of activation function in the performance of the Neural Network. In this paper, we present the modification of Neural Network activation function to improve the performance of the Neural Network. The theoretical background of the modification, including mathematical proof is fully described in the paper. The modified activation function is code name as SigHyper. The performance of SigHyper was evaluated against state of the art activation function on the crude oil price dataset. Results suggested that the proposed SigHyper was found to improved accuracy of the Neural Network. Analysis of variance showed that the accuracy of the SigHyper is significant. It was established that the SigHyper require further improvement. The activation function proposed in this research has added to the activation functions already discussed in the literature. The study may motivate researchers to further modify activation functions, hence, improve the performance of the Neural Network.","PeriodicalId":225159,"journal":{"name":"2014 4th International Conference on Artificial Intelligence with Applications in Engineering and Technology","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-03-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 4th International Conference on Artificial Intelligence with Applications in Engineering and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAIET.2014.12","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

The Neural Network is said to emulate the brain, although its processing is not quite how the biological brain actually works. The Neural Network has seen significant improvement from 1943 to date. However, modifications to the Neural Network have mainly focused on the structure itself rather than the activation function, despite the activation function's critical role in Neural Network performance. In this paper, we present a modification of the Neural Network activation function to improve the performance of the Neural Network. The theoretical background of the modification, including a mathematical proof, is fully described in the paper. The modified activation function is code-named SigHyper. The performance of SigHyper was evaluated against a state-of-the-art activation function on a crude oil price dataset. Results suggest that the proposed SigHyper improved the accuracy of the Neural Network. Analysis of variance showed that the improvement in accuracy from SigHyper is statistically significant. It was also established that SigHyper requires further improvement. The activation function proposed in this research adds to the activation functions already discussed in the literature. The study may motivate researchers to further modify activation functions and hence improve the performance of the Neural Network.
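The excerpt does not reproduce SigHyper's formula. For orientation only, the sketch below implements the two standard activations the name appears to reference, the logistic sigmoid and the hyperbolic tangent, plus a hypothetical sigmoid-plus-tanh blend; the `sighyper` function here is an illustrative assumption, not the paper's actual definition.

```python
import numpy as np

def sigmoid(x):
    """Standard logistic sigmoid: 1 / (1 + e^(-x)), range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def sighyper(x):
    """Hypothetical SigHyper-style activation (assumption).

    The paper's exact formula is not given in this excerpt; the code
    name suggests a combination of the sigmoid and the hyperbolic
    tangent, so this sketch simply sums the two as a placeholder.
    """
    return sigmoid(x) + np.tanh(x)

# Compare the activations on a small input range.
x = np.linspace(-4.0, 4.0, 9)
print("sigmoid :", np.round(sigmoid(x), 3))
print("tanh    :", np.round(np.tanh(x), 3))
print("sighyper:", np.round(sighyper(x), 3))
```

Any such combined form would change the output range of the unit (here roughly (-1, 2) instead of the sigmoid's (0, 1)), which is one plausible route by which a modified activation could affect prediction accuracy; the paper itself should be consulted for the actual construction and proof.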