Comparison of different neural network architectures for digit image recognition
Hao Yu, Tiantian Xie, Michael Hamilton, B. Wilamowski
2011 4th International Conference on Human System Interactions (HSI 2011), 2011-05-19
DOI: 10.1109/HSI.2011.5937350
Citations: 10
Abstract
The paper presents the design of three types of neural networks with different features: traditional backpropagation networks, radial basis function networks, and counterpropagation networks. Traditional backpropagation networks require a complex training process before they can be applied to classification or approximation. Radial basis function networks simplify the training process through their specially organized three-layer architecture. Counterpropagation networks do not require a training process at all and can be designed directly by extracting all parameters from the input data. Both the design complexity and the generalization ability of the three neural network architectures are compared on a digit image recognition problem.
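To make the contrast concrete, below is a minimal, illustrative sketch (not the paper's implementation) of a radial basis function classifier in Python/NumPy. The Gaussian hidden layer is fixed by picking centers directly from the data, and only the linear output layer is fitted by least squares, which is the sense in which RBF training is simpler than iterative backpropagation. All sizes, the Gaussian width, and the toy data are illustrative assumptions.

```python
# Minimal RBF-network sketch (illustrative, not the paper's code):
# hidden units are Gaussian kernels centred on training samples, and
# only the linear output layer is fitted, so no iterative training loop
# is needed, in contrast with backpropagation networks.
import numpy as np

def rbf_design_matrix(X, centers, sigma):
    # Gaussian activations for every (sample, center) pair
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_rbf(X, Y, centers, sigma):
    # Solve the linear output weights by least squares against one-hot targets Y
    Phi = rbf_design_matrix(X, centers, sigma)
    W, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return W

def predict_rbf(X, centers, sigma, W):
    return rbf_design_matrix(X, centers, sigma) @ W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy stand-in for digit images: 200 samples of 64 "pixels", 10 classes
    X = rng.normal(size=(200, 64))
    labels = rng.integers(0, 10, size=200)
    Y = np.eye(10)[labels]                           # one-hot targets
    centers = X[rng.choice(200, 40, replace=False)]  # centers taken from data
    W = fit_rbf(X, Y, centers, sigma=4.0)
    pred = predict_rbf(X, centers, 4.0, W).argmax(axis=1)
    print("training accuracy:", (pred == labels).mean())
```

A counterpropagation design, as described in the abstract, would go one step further and derive all parameters directly from the input data, while a backpropagation network would instead fit every layer with an iterative gradient-based procedure.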