Modified Neural Network Activation Function
Adamu I. Abubakar, H. Chiroma, Sameem Abdulkareem, A. Gital, S. A. Muaz, Jafaar Maitama, Muhammad Lamir Isah, T. Herawan
2014 4th International Conference on Artificial Intelligence with Applications in Engineering and Technology
Published: 2014-03-12 · DOI: 10.1109/ICAIET.2014.12
Citations: 1
Abstract
Neural Networks are said to emulate the brain, although their processing does not closely match how the biological brain actually works. The Neural Network has seen significant improvement from 1943 to date. However, modifications to the Neural Network have mainly focused on the structure itself rather than the activation function, despite the critical role the activation function plays in Neural Network performance. In this paper, we present a modification of the Neural Network activation function to improve the performance of the Neural Network. The theoretical background of the modification, including a mathematical proof, is fully described in the paper. The modified activation function is code-named SigHyper. The performance of SigHyper was evaluated against a state-of-the-art activation function on a crude oil price dataset. Results suggest that the proposed SigHyper improved the accuracy of the Neural Network. Analysis of variance showed that the accuracy gain of SigHyper is statistically significant. It was also established that SigHyper requires further improvement. The activation function proposed in this research adds to the activation functions already discussed in the literature. The study may motivate researchers to further modify activation functions and thereby improve the performance of the Neural Network.
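The abstract does not give SigHyper's closed form, only that it modifies the standard sigmoid using a hyperbolic component. As a purely illustrative sketch, and not the paper's actual definition, the function `sighyper_like` below blends the logistic sigmoid with a hyperbolic tangent rescaled to (0, 1), preserving the bounded, monotonic shape an activation of this kind would need:

```python
import numpy as np

def sigmoid(x):
    """Standard logistic sigmoid: squashes input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def sighyper_like(x):
    """Illustrative sigmoid/hyperbolic blend -- an assumption for
    demonstration only, NOT the SigHyper defined in the paper.
    Averages the sigmoid with tanh rescaled from (-1, 1) to (0, 1),
    so the result stays bounded in (0, 1) and strictly increasing."""
    return 0.5 * (sigmoid(x) + 0.5 * (np.tanh(x) + 1.0))

# Compare the two activations over a small input grid.
x = np.linspace(-5.0, 5.0, 11)
print(np.round(sigmoid(x), 4))
print(np.round(sighyper_like(x), 4))
```

Because tanh saturates faster than the sigmoid, the blend has a steeper slope near the origin than the plain sigmoid, which is one plausible way a modified activation could change training dynamics; the paper's own derivation and proof would fix the exact form.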