{"title":"具有随机小波函数参数的小波神经网络","authors":"H. Bazoobandi","doi":"10.5829/ije.2017.30.10a.12","DOIUrl":null,"url":null,"abstract":"The training algorithm of Wavelet Neural Networks (WNN) is a bottleneck which impacts on the accuracy of the final WNN model. Several methods have been proposed for training the WNNs. From the perspective of our research, most of these algorithms are iterative and need to adjust all the parameters of WNN. This paper proposes a one-step learning method which changes the weights between hidden layer and output layer of the network; meanwhile, the wavelet function parameters are randomly assigned and kept fixed during the training process. Besides the simplicity and speed of the proposed one-step algorithm, the experimental results verify the performance of the proposed method in terms of final model accuracy and computational time.","PeriodicalId":416886,"journal":{"name":"International journal of engineering. Transactions A: basics","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-08-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"WAVELET NEURAL NETWORK WITH RANDOM WAVELET FUNCTION PARAMETERS\",\"authors\":\"H. Bazoobandi\",\"doi\":\"10.5829/ije.2017.30.10a.12\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The training algorithm of Wavelet Neural Networks (WNN) is a bottleneck which impacts on the accuracy of the final WNN model. Several methods have been proposed for training the WNNs. From the perspective of our research, most of these algorithms are iterative and need to adjust all the parameters of WNN. This paper proposes a one-step learning method which changes the weights between hidden layer and output layer of the network; meanwhile, the wavelet function parameters are randomly assigned and kept fixed during the training process. Besides the simplicity and speed of the proposed one-step algorithm, the experimental results verify the performance of the proposed method in terms of final model accuracy and computational time.\",\"PeriodicalId\":416886,\"journal\":{\"name\":\"International journal of engineering. Transactions A: basics\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-08-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International journal of engineering. Transactions A: basics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.5829/ije.2017.30.10a.12\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International journal of engineering. Transactions A: basics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5829/ije.2017.30.10a.12","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
WAVELET NEURAL NETWORK WITH RANDOM WAVELET FUNCTION PARAMETERS
The training algorithm of a Wavelet Neural Network (WNN) is a bottleneck that affects the accuracy of the final WNN model. Several methods have been proposed for training WNNs; most of them are iterative and must adjust all of the network's parameters. This paper proposes a one-step learning method that adjusts only the weights between the hidden layer and the output layer of the network, while the wavelet function parameters are randomly assigned and kept fixed throughout training. In addition to the simplicity and speed of the proposed one-step algorithm, the experimental results confirm the performance of the proposed method in terms of final model accuracy and computational time.
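The abstract gives no implementation details, but the scheme it describes (random, fixed hidden-layer parameters plus a one-step fit of the output weights) can be illustrated with a randomized single-hidden-layer network trained by least squares. The following is a minimal sketch under stated assumptions: the Mexican-hat (Ricker) mother wavelet, the uniform sampling ranges, and all names and parameter values below are illustrative choices, not the paper's specification.

```python
import numpy as np

def mexican_hat(z):
    # Mexican-hat (Ricker) mother wavelet; the paper's abstract does not
    # name the wavelet used, so this choice is an assumption.
    return (1.0 - z**2) * np.exp(-0.5 * z**2)

class RandomWaveletNetwork:
    """Sketch of a WNN whose wavelet translation/dilation parameters are
    drawn at random and kept fixed; only the hidden-to-output weights are
    fitted, in a single step, by linear least squares."""

    def __init__(self, n_inputs, n_hidden, seed=None):
        rng = np.random.default_rng(seed)
        # Random input weights, translations (b), and dilations (a);
        # all fixed after initialization. Ranges are illustrative.
        self.W = rng.uniform(-1.0, 1.0, size=(n_hidden, n_inputs))
        self.b = rng.uniform(-1.0, 1.0, size=n_hidden)
        self.a = rng.uniform(0.5, 2.0, size=n_hidden)  # keep dilations positive
        self.beta = None  # output weights, the only learned parameters

    def _hidden(self, X):
        # z_j = (w_j . x - b_j) / a_j, passed through the mother wavelet
        return mexican_hat((X @ self.W.T - self.b) / self.a)

    def fit(self, X, y):
        # One-step training: solve min_beta ||H beta - y||^2 directly.
        H = self._hidden(X)
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Usage: approximate a 1-D function from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)
model = RandomWaveletNetwork(n_inputs=1, n_hidden=40, seed=0).fit(X, y)
print("train MSE:", np.mean((model.predict(X) - y) ** 2))
```

Because the wavelet parameters are frozen at their random values, the training problem collapses to a single linear least-squares solve for the output weights, which is what makes the method one-step rather than iterative.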