Are modified back-propagation algorithms worth the effort?
D. Alpsan, M. Towsey, O. Ozdamar, A. Tsoi, D. Ghista
Proceedings of 1994 IEEE International Conference on Neural Networks (ICNN'94), December 1994. DOI: 10.1109/ICNN.1994.374227
Citations: 4
Abstract
A wide range of modifications and extensions to the backpropagation (BP) algorithm have been tested on a real-world medical problem. Our results show that: 1) proper tuning of the learning parameters of standard BP not only increases the speed of learning but also has a significant effect on generalisation; 2) parameter combinations and training options which lead to fast learning do not usually yield good generalisation, and vice versa; 3) standard BP may be fast enough when its parameters are finely tuned; 4) modifications developed on artificial problems for faster learning do not necessarily give faster learning on real-world problems, and when they do, it may be at the expense of generalisation; and 5) even when modified BP algorithms perform well, they may require extensive fine-tuning to achieve this performance. For our problem, none of the modifications could justify the effort to implement them.
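The "learning parameters of standard BP" referred to in point 1 are the learning rate and momentum term of the classic gradient-descent weight update. The sketch below illustrates that update rule in isolation; the specific values (learning rate 0.1, momentum 0.9) and the toy quadratic objective are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def bp_update(w, grad, velocity, lr=0.1, momentum=0.9):
    """One standard BP weight update with momentum:
    v <- momentum * v - lr * grad;  w <- w + v.
    lr and momentum are the tunable parameters the abstract refers to."""
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

# Toy example: minimise f(w) = w^2 (gradient 2w), starting from w = 5.0.
w, v = 5.0, 0.0
for _ in range(200):
    w, v = bp_update(w, 2.0 * w, v)
# w is now close to the minimum at 0; how quickly it gets there
# depends strongly on the lr/momentum choice, which is the paper's point 1.
```

With a poorly chosen pair (e.g. a learning rate large enough that updates overshoot), the same loop diverges instead of converging, which is why the authors find fine-tuning these two parameters matters as much as swapping in a modified algorithm.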