Yi-Hsien Lin
Proceedings of the 4th International Conference on Advanced Information Science and System (2022-11-25)
DOI: 10.1145/3573834.3574486
RPSigmoid: A Randomized Parameterization for a Sigmoidal Activation Function
Activation functions are integral components of neural networks: they are the computations each neuron applies to its input before passing the result on to the next neuron, helping the network fit the ground truth of the data sooner and therefore converge faster. However, popular activation functions are either not parameterized or have too few parameters, so the shape of the activation function cannot be fully trained. This paper introduces RPSigmoid, an activation function based on the Sigmoid function with four additional parameters representing the vertical stretch factor, horizontal stretch factor, angularity, and slope of the asymptotes (which may be horizontal or oblique) of the sigmoidal curve. These parameters are randomized within a range before training, and their values are updated along with the other network parameters during backpropagation. The positive results reported for RPSigmoid show that this low-resource approach yields strong training outcomes.
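The abstract names the four parameters but not the paper's exact formula, so the following is only a plausible sketch of such a parameterization: a vertical stretch `a`, a horizontal stretch `b`, an angularity `k` controlling steepness near the origin, and an added linear term with slope `m` that turns the horizontal asymptotes oblique when `m` is nonzero. All names and the functional form here are assumptions for illustration, not the paper's definition.

```python
import math
import random

def rpsigmoid(x, a, b, k, m):
    """Hypothetical RPSigmoid-style activation (the paper's exact
    formula is not given in the abstract):
      a : vertical stretch of the sigmoidal curve
      b : horizontal stretch
      k : angularity (steepness near the origin)
      m : slope of the asymptotes; m = 0 gives horizontal asymptotes,
          m != 0 gives oblique ones
    """
    return a / (1.0 + math.exp(-k * x / b)) + m * x

def init_rpsigmoid_params(rng=random):
    # As described in the abstract, the parameters are randomized
    # within a range before training; afterwards they would be updated
    # by backpropagation like any other network weight. The ranges
    # below are illustrative guesses.
    return (rng.uniform(0.5, 2.0),   # a
            rng.uniform(0.5, 2.0),   # b
            rng.uniform(0.5, 2.0),   # k
            rng.uniform(0.0, 0.5))   # m
```

With `a = 1, b = 1, k = 1, m = 0` this reduces to the ordinary Sigmoid; nonzero `m` makes the output grow linearly for large inputs instead of saturating, which is one way to realize the "oblique asymptotes" the abstract mentions.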