{"title":"Comparative Analysis of Activation Functions Used in the Hidden Layers of Deep Neural Networks","authors":"Martin Kaloev, Georgi Krastev","doi":"10.1109/HORA52670.2021.9461312","DOIUrl":null,"url":null,"abstract":"The development in the field of neural networks opens up opportunities for the use of many activation functions, each of which has its own specific features. This raises questions about how compatible the different activation functions are and whether their exchange affects the operation of a neural network. The article reviews the design, training and research of a Deep Neural Network. The Network is applied for curve recognition Three popular activation functions are analysed in the hidden layers – sigmoid function (Sigmoid), a hyperbolic tangent (tanh) and a rectified linear unit (ReLU). The results of this study will be useful in the design of Deep Neural Networks and in the selection of activation functions.","PeriodicalId":270469,"journal":{"name":"2021 3rd International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-06-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 3rd International Congress on Human-Computer Interaction, Optimization and Robotic Applications (HORA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/HORA52670.2021.9461312","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 8
Abstract
The development in the field of neural networks opens up opportunities for the use of many activation functions, each of which has its own specific features. This raises questions about how compatible the different activation functions are and whether exchanging them affects the operation of a neural network. The article reviews the design, training and study of a Deep Neural Network. The network is applied to curve recognition. Three popular activation functions are analysed in the hidden layers: the sigmoid function (Sigmoid), the hyperbolic tangent (tanh) and the rectified linear unit (ReLU). The results of this study will be useful in the design of Deep Neural Networks and in the selection of activation functions.
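For illustration only, the sketch below (not the authors' code) defines the three activation functions named in the abstract and runs a minimal feed-forward pass in which the hidden-layer activation can be swapped while everything else is held fixed; the layer sizes, weights, and example input are assumed placeholders.

```python
# Minimal sketch, assuming a small fully connected network with swappable
# hidden-layer activations. Sizes, weights, and the input are illustrative.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))      # logistic sigmoid, output in (0, 1)

def tanh(x):
    return np.tanh(x)                     # hyperbolic tangent, output in (-1, 1)

def relu(x):
    return np.maximum(0.0, x)             # rectified linear unit, output in [0, inf)

def forward(x, weights, biases, hidden_activation):
    """One forward pass; every hidden layer uses the same activation."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = hidden_activation(a @ W + b)   # hidden layers: swappable activation
    W_out, b_out = weights[-1], biases[-1]
    return sigmoid(a @ W_out + b_out)      # output layer kept fixed for comparison

# Illustrative 2-16-16-1 network with random weights (assumed, not from the paper).
rng = np.random.default_rng(0)
sizes = [2, 16, 16, 1]
weights = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

x = np.array([0.5, -1.2])
for name, act in [("Sigmoid", sigmoid), ("tanh", tanh), ("ReLU", relu)]:
    print(name, forward(x, weights, biases, act))
```

Swapping only the hidden activation while keeping the architecture, weights and output layer fixed mirrors the kind of controlled comparison the abstract describes.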