A SOFT-backpropagation algorithm for training neural networks
M. El Adawy, M. Aboulwafa, H. Keshk, M.M. El Tayeb
Proceedings of the Nineteenth National Radio Science Conference, 2002. DOI: 10.1109/NRSC.2002.1022647
Citations: 8
Abstract
The backpropagation (BP) algorithm is one of the most common algorithms used in the training of neural networks. The single offspring technique (SOFT algorithm) is a new technique (see Likartsis, A. et al., Proc. 9th Int. Conf. on Tools with Artificial Intelligence, p.32-6, 1997; Yao, X., Proc. IEEE, vol.87, p.1425-47, 1999) for applying the genetic algorithm to the training of neural networks, and it reduces the training time compared with the backpropagation algorithm. We introduce a new technique: a hybrid SOFT-BP algorithm in which the SOFT algorithm is applied first to obtain a good initial weight vector. This vector is then passed to the backpropagation algorithm, which improves the precision of the weight vector until an acceptable error limit is reached. The results show an acceptable improvement in training speed for the hybrid technique compared with the backpropagation or SOFT algorithm alone. We also study the success ratio (the number of trials in which the algorithm finds a solution, divided by the total number of trials) for the new hybrid algorithm. A recommended range for the switching error limit, at which training switches from the SOFT algorithm to the BP algorithm, is suggested.
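The abstract gives only the two-phase idea, not the algorithm's details, but that idea is easy to illustrate. Below is a minimal sketch in Python/NumPy, assuming a tiny 2-2-1 sigmoid network on XOR: a genetic-style search, with a single offspring produced from the two fittest parents each generation standing in for the SOFT operator, runs until the best error falls below a switching threshold, after which plain batch backpropagation refines that weight vector. The network size, the `switch_error` value, and all other hyperparameters are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of the hybrid SOFT-BP scheme: genetic search for an
# initial weight vector, then gradient descent (backpropagation) to
# refine it. The SOFT operator and hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-2-1 network on XOR; all weights flattened into one 9-vector.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def unpack(w):
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8].reshape(2, 1), w[8:9]
    return W1, b1, W2, b2

def forward(w):
    W1, b1, W2, b2 = unpack(w)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))    # hidden sigmoid layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # output sigmoid layer
    return h, out

def mse(w):
    _, out = forward(w)
    return float(np.mean((out - y) ** 2))

def soft_phase(pop_size=20, switch_error=0.10, max_gens=2000):
    """Genetic search until the best error drops below switch_error."""
    pop = rng.normal(0, 1, size=(pop_size, 9))
    for _ in range(max_gens):
        errs = np.array([mse(w) for w in pop])
        order = np.argsort(errs)
        if errs[order[0]] < switch_error:
            break
        # Single offspring: average the two fittest parents, mutate,
        # and replace the worst individual (one child per generation).
        child = 0.5 * (pop[order[0]] + pop[order[1]])
        child += rng.normal(0, 0.3, size=9)
        pop[order[-1]] = child
    return pop[np.argmin([mse(w) for w in pop])]

def bp_phase(w, lr=0.5, target_error=1e-3, max_epochs=20000):
    """Plain batch backpropagation from the GA-supplied start point."""
    w = w.copy()
    for _ in range(max_epochs):
        W1, b1, W2, b2 = unpack(w)
        h, out = forward(w)
        # Gradients of the MSE through the two sigmoid layers.
        d_out = 2 * (out - y) / len(X) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        grad = np.concatenate([
            (X.T @ d_h).ravel(), d_h.sum(0),
            (h.T @ d_out).ravel(), d_out.sum(0),
        ])
        w -= lr * grad
        if mse(w) < target_error:
            break
    return w

w0 = soft_phase()
w = bp_phase(w0)
print(f"error after SOFT phase: {mse(w0):.4f}, after BP phase: {mse(w):.6f}")
```

In the scheme the abstract describes, the switching error limit governs the trade-off: switching too early hands BP a poor starting point, while switching too late spends generations of the slower genetic search on fine precision that gradient descent handles better, which is presumably why the paper recommends a range for it.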