Effect of Hyperparameters on Backpropagation
Aaditree Jaisswal, Anjali Naik
2021 IEEE Pune Section International Conference (PuneCon), December 16, 2021
DOI: 10.1109/punecon52575.2021.9686489

Abstract: In machine learning algorithms, both parameters and hyperparameters are important properties of the training process. Parameters are modified by the learning algorithm itself, while hyperparameters are adjusted manually to achieve the desired accuracy and improve efficiency. In neural networks, the weights are parameters, while hyperparameters include layer size, momentum, learning rate, activation-function family, weight initialization, and the normalization scheme for input data. The multi-layer feed-forward backpropagation neural network (BPN) has been used to solve a variety of classification problems, since it can approximate "any" input-output mapping and classify linearly non-separable data. The research work presented in this paper gives a detailed description of BPN and explains various hyperparameters for BPN. The paper describes the implementation of BPN, along with experimentation and analysis with hyperparameters such as weight initialization and learning rate.
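The distinction the abstract draws can be illustrated with a minimal sketch, not taken from the paper itself: a one-hidden-layer backpropagation network trained on XOR (the classic linearly non-separable problem), where the weights are the learned parameters and the learning rate and weight-initialization scale are the manually chosen hyperparameters. The architecture, values, and function names here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def train_bpn(X, y, hidden=4, lr=0.5, init_scale=0.5, epochs=2000, seed=0):
    """Train a one-hidden-layer backpropagation network with sigmoid units.

    Hyperparameters (set by hand): hidden, lr, init_scale, epochs.
    Parameters (learned by backpropagation): W1, b1, W2, b2.
    Returns the mean-squared-error loss recorded at each epoch.
    """
    rng = np.random.default_rng(seed)
    # Weight initialization: uniform in [-init_scale, init_scale]
    W1 = rng.uniform(-init_scale, init_scale, (X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.uniform(-init_scale, init_scale, (hidden, 1))
    b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    losses = []
    for _ in range(epochs):
        # Forward pass
        h = sig(X @ W1 + b1)
        out = sig(h @ W2 + b2)
        err = out - y
        losses.append(float(np.mean(err ** 2)))
        # Backward pass: propagate the error through the sigmoid derivatives
        d_out = err * out * (1.0 - out)
        d_h = (d_out @ W2.T) * h * (1.0 - h)
        # Gradient-descent update scaled by the learning rate
        W2 -= lr * h.T @ d_out
        b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * X.T @ d_h
        b1 -= lr * d_h.sum(axis=0)
    return losses

# XOR: not separable by any single linear boundary
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
losses = train_bpn(X, y)
```

Re-running this with a different `lr` or `init_scale` shows how strongly those two hyperparameters alone affect the loss curve, which is the kind of experiment the paper analyzes.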