{"title":"Euclidean input mapping in a N-tuple approximation network","authors":"A. Kolcz, N. Allinson","doi":"10.1109/DSP.1994.379821","DOIUrl":null,"url":null,"abstract":"A type of the N-tuple neural architecture can be shown to perform function approximation based on local interpolation, similar that performed by RBF networks. Since the size and speed of operation in this implementation are independent of the training set size, it is attractive for practical adaptive solutions. However, the kernel function used by the network is non-Euclidean, which can cause performance losses for high-dimensional input data. The authors investigate methods for realising more isotropic kernel basis functions by use of special data encoding techniques.<<ETX>>","PeriodicalId":189083,"journal":{"name":"Proceedings of IEEE 6th Digital Signal Processing Workshop","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of IEEE 6th Digital Signal Processing Workshop","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DSP.1994.379821","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 5
Abstract
A type of N-tuple neural architecture can be shown to perform function approximation based on local interpolation, similar to that performed by RBF networks. Since the size and speed of operation of this implementation are independent of the training set size, it is attractive for practical adaptive solutions. However, the kernel function used by the network is non-Euclidean, which can cause performance losses for high-dimensional input data. The authors investigate methods for realising more isotropic kernel basis functions by use of special data encoding techniques.
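To make the idea concrete, the following is a minimal sketch of an N-tuple (RAM-based) function approximator that uses a thermometer (unary) input encoding, one of the simpler encodings that makes the Hamming-distance kernel behave more like a Euclidean one. All names, sizes, and parameters here are illustrative assumptions, not the encoding schemes or implementation studied in the paper.

```python
import numpy as np

# Illustrative sketch of an N-tuple function approximator (assumed design,
# not the authors' implementation). Inputs are thermometer-encoded so that
# the Hamming distance between codes grows with |x1 - x2|, approximating a
# Euclidean-like kernel in the encoded space.

RESOLUTION = 64   # bits per input (thermometer code length) -- assumed value
N_TUPLE = 4       # bits sampled by each tuple -- assumed value
N_TUPLES = 32     # number of tuples (RAM nodes) -- assumed value
rng = np.random.default_rng(0)

def thermometer(x, lo=0.0, hi=1.0, bits=RESOLUTION):
    """Encode a scalar into a unary (thermometer) bit vector."""
    level = int(np.clip((x - lo) / (hi - lo), 0.0, 1.0) * (bits - 1))
    code = np.zeros(bits, dtype=np.uint8)
    code[: level + 1] = 1
    return code

# Each tuple samples a fixed random subset of the encoded bit positions.
tuple_masks = [rng.choice(RESOLUTION, N_TUPLE, replace=False)
               for _ in range(N_TUPLES)]

# One small lookup table (RAM) per tuple: address -> (sum of targets, count).
tables = [dict() for _ in range(N_TUPLES)]

def addresses(x):
    """Map an input to one RAM address per tuple."""
    code = thermometer(x)
    return [int("".join(str(b) for b in code[m]), 2) for m in tuple_masks]

def train(x, y):
    """Store the target value at every addressed RAM location."""
    for table, addr in zip(tables, addresses(x)):
        s, c = table.get(addr, (0.0, 0))
        table[addr] = (s + y, c + 1)

def predict(x):
    """Average the stored targets of all addressed, non-empty RAM locations."""
    num, den = 0.0, 0
    for table, addr in zip(tables, addresses(x)):
        if addr in table:
            s, c = table[addr]
            num += s / c
            den += 1
    return num / den if den else 0.0

# Example: approximate a smooth 1-D function from a handful of samples.
for xi in np.linspace(0.0, 1.0, 50):
    train(xi, np.sin(2 * np.pi * xi))
print(predict(0.3), np.sin(2 * np.pi * 0.3))
```

Note that training and prediction cost depend only on the fixed network size (number of tuples and table entries), not on the number of training samples, which is the practical advantage the abstract highlights; the encoding choice is what shapes the implied kernel.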