{"title":"Evaluation of cluster combination functions for mixture of experts","authors":"R. Redhead, M. Heywood","doi":"10.1109/IJCNN.2005.1556016","DOIUrl":null,"url":null,"abstract":"The mixtures of experts (MoE) model provides the basis for building modular neural network solutions. In this work we are interested in methods for decomposing the input before forwarding to the MoE architecture. By doing so we are able to define the number of experts from the data itself. Specific schemes are shown to be appropriate for regression and classification problems, where each appear to have different preferences.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"49 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2005.1556016","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
The mixture of experts (MoE) model provides the basis for building modular neural network solutions. In this work we are interested in methods for decomposing the input before forwarding it to the MoE architecture. By doing so, we are able to define the number of experts from the data itself. Specific schemes are shown to be appropriate for regression and classification problems, where each appears to have different preferences.
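To illustrate the general idea described in the abstract, the following is a minimal sketch, not the paper's method: the input is clustered first, the cluster count sets the number of experts, one expert is trained per cluster, and expert outputs are combined with a distance-based gate. The choice of k-means, ridge experts, the fixed expert count, and the softmax-style gate are all assumptions made for illustration; the paper itself evaluates several cluster combination functions.

```python
# Hypothetical sketch (not the paper's implementation): decompose the input with
# k-means, assign one expert per cluster, and combine experts with a soft gate
# derived from distance to each cluster centroid.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Toy regression data with two regimes that clustering should separate.
X = rng.uniform(-3, 3, size=(400, 1))
y = np.where(X[:, 0] < 0, np.sin(X[:, 0]), 0.5 * X[:, 0]) + 0.05 * rng.standard_normal(400)

# 1. Cluster the input space; the number of clusters defines the number of experts.
#    (Here it is fixed to 2 for brevity; in practice it would be chosen from the data.)
n_experts = 2
kmeans = KMeans(n_clusters=n_experts, n_init=10, random_state=0).fit(X)

# 2. Train one simple expert per cluster on the points assigned to it.
experts = []
for k in range(n_experts):
    mask = kmeans.labels_ == k
    experts.append(Ridge(alpha=1e-3).fit(X[mask], y[mask]))

# 3. Combine expert outputs with a soft, centroid-distance-based gate.
def predict(X_new, temperature=1.0):
    d = kmeans.transform(X_new)              # distances to each centroid
    gate = np.exp(-d / temperature)
    gate /= gate.sum(axis=1, keepdims=True)  # normalise to a soft assignment
    preds = np.column_stack([e.predict(X_new) for e in experts])
    return (gate * preds).sum(axis=1)

print("train MSE:", np.mean((predict(X) - y) ** 2))
```

A hard gate (using only the nearest cluster's expert) is the other obvious combination choice; the soft version above simply blends experts near cluster boundaries.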