{"title":"Batch linear least squares-based learning algorithm for MLMVN with soft margins","authors":"E. Aizenberg, I. Aizenberg","doi":"10.1109/CIDM.2014.7008147","DOIUrl":null,"url":null,"abstract":"In this paper, we consider a batch learning algorithm for the multilayer neural network with multi-valued neurons (MLMVN) and its soft margins variant (MLMVN-SM). MLMVN is a neural network with a standard feedforward organization based on the multi-valued neuron (MVN). MVN is a neuron with complex-valued weights and inputs/output located on the unit circle. Standard MLMVN has a derivative-free learning algorithm based on the error-correction learning rule. Recently, this algorithm was modified for MLMVN with discrete outputs by using soft margins (MLMVN-SM). This modification improves classification results when MLMVN is used as a classifier. Another recent development in MLMVN is the use of batch acceleration step for MLMVN with a single output neuron. Complex QR-decomposition was used to adjust the output neuron weights for all learning samples simultaneously, while the hidden neuron weights were adjusted in a regular way. In this paper, we merge the soft margins approach with batch learning. We suggest a batch linear least squares (LLS) learning algorithm for MLMVN-SM. We also expand the batch technique to multiple output neurons and hidden neurons. This new learning technique drastically reduces the number of learning iterations and learning time when solving classification problems (compared to MLMVN-SM), while maintaining the classification accuracy of MLMVN-SM.","PeriodicalId":117542,"journal":{"name":"2014 IEEE Symposium on Computational Intelligence and Data Mining (CIDM)","volume":"64 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 IEEE Symposium on Computational Intelligence and Data Mining (CIDM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CIDM.2014.7008147","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 10
Abstract
In this paper, we consider a batch learning algorithm for the multilayer neural network with multi-valued neurons (MLMVN) and its soft-margins variant (MLMVN-SM). MLMVN is a neural network with a standard feedforward organization built from multi-valued neurons (MVN). An MVN is a neuron with complex-valued weights whose inputs and output lie on the unit circle. Standard MLMVN has a derivative-free learning algorithm based on the error-correction learning rule. Recently, this algorithm was modified for MLMVN with discrete outputs by introducing soft margins (MLMVN-SM), which improves classification results when MLMVN is used as a classifier. Another recent development is a batch acceleration step for MLMVN with a single output neuron: a complex QR decomposition was used to adjust the output neuron's weights for all learning samples simultaneously, while the hidden neuron weights were adjusted in the regular way. In this paper, we merge the soft-margins approach with batch learning and propose a batch linear least squares (LLS) learning algorithm for MLMVN-SM. We also extend the batch technique to multiple output neurons and to hidden neurons. Compared to MLMVN-SM, this new learning technique drastically reduces the number of learning iterations and the learning time required to solve classification problems, while maintaining the classification accuracy of MLMVN-SM.
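To make the two core ideas in the abstract concrete, the sketch below illustrates (a) a discrete MVN activation that maps a complex weighted sum onto one of k points on the unit circle, and (b) a batch linear least squares update that fits an output layer's weights for all learning samples at once. This is a minimal illustration under assumptions, not the authors' implementation: the function names, the choice of k equally spaced sectors, and the use of NumPy's lstsq (in place of an explicit complex QR decomposition) are all hypothetical conveniences.

```python
# Minimal sketch of two MLMVN building blocks (assumptions noted in comments).
import numpy as np

def mvn_discrete_activation(z: np.ndarray, k: int) -> np.ndarray:
    """Map complex weighted sums z onto the k-th roots of unity (discrete MVN output).

    Assumes the unit circle is divided into k equal sectors, as in discrete MVN.
    """
    angles = np.mod(np.angle(z), 2 * np.pi)        # arg(z) in [0, 2*pi)
    sector = np.floor(k * angles / (2 * np.pi))    # sector index j = 0, ..., k-1
    return np.exp(2j * np.pi * sector / k)         # output e^{i 2*pi j / k}

def batch_lls_output_weights(hidden_outputs: np.ndarray,
                             desired_outputs: np.ndarray) -> np.ndarray:
    """Fit output-neuron weights over ALL learning samples simultaneously.

    hidden_outputs  : (N, n) complex matrix of hidden-layer outputs for N samples
    desired_outputs : (N, m) complex matrix of desired outputs for m output neurons
    Returns an (n+1, m) complex weight matrix; row 0 holds the bias weights w0.
    """
    N = hidden_outputs.shape[0]
    # Prepend a column of ones so w0 acts as the bias term.
    X = np.hstack([np.ones((N, 1), dtype=complex), hidden_outputs])
    # Complex linear least squares; NumPy handles complex data directly.
    # (The paper solves this system via a complex QR decomposition; lstsq is
    # used here only as a stand-in solver for the same least-squares problem.)
    W, *_ = np.linalg.lstsq(X, desired_outputs, rcond=None)
    return W
```

The intent of the batch step mirrors the abstract: instead of correcting the output weights sample by sample with the error-correction rule, all N samples form one overdetermined complex linear system that is solved in a single shot, which is where the reduction in learning iterations comes from.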