{"title":"Using structure of data to improve classification","authors":"C. O'keefe, G. Jarrad","doi":"10.1109/IDC.2002.995419","DOIUrl":null,"url":null,"abstract":"Statistical mixture-of-experts models are often used for data analysis tasks such as clustering, regression and classification. We consider two mixture-of-experts models, the shared mixture classifier and the hierarchical mixture-of-experts classifier. We discuss the initialisation and optimisation of the structure and parameters of each classifier. In particular, we initialise the hierarchical mixture of experts classifier with the public domain OC1 decision tree software. We compare the performance of the two classifiers on four datasets, two artificial and two real, finding that the hierarchical mixture-of-experts classifier achieves superior classification performance on the testing data.","PeriodicalId":385351,"journal":{"name":"Final Program and Abstracts on Information, Decision and Control","volume":"21 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2002-08-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Final Program and Abstracts on Information, Decision and Control","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IDC.2002.995419","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Statistical mixture-of-experts models are often used for data analysis tasks such as clustering, regression and classification. We consider two mixture-of-experts models, the shared mixture classifier and the hierarchical mixture-of-experts classifier. We discuss the initialisation and optimisation of the structure and parameters of each classifier. In particular, we initialise the hierarchical mixture-of-experts classifier with the public domain OC1 decision tree software. We compare the performance of the two classifiers on four datasets, two artificial and two real, finding that the hierarchical mixture-of-experts classifier achieves superior classification performance on the testing data.
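For orientation, a standard two-level hierarchical mixture-of-experts formulation (in the style of Jordan and Jacobs; the exact parameterisation used in this paper may differ) combines expert class predictions through nested softmax gating networks:

$$
P(y \mid x) \;=\; \sum_{i} g_i(x) \sum_{j} g_{j \mid i}(x)\, P(y \mid x, \theta_{ij}),
\qquad
g_i(x) \;=\; \frac{\exp(v_i^{\top} x)}{\sum_{k} \exp(v_k^{\top} x)},
$$

where $g_i(x)$ is the top-level gate, $g_{j \mid i}(x)$ is the nested gate within branch $i$ (defined analogously by a softmax over that branch's gate parameters), and $P(y \mid x, \theta_{ij})$ is the prediction of expert $(i,j)$. Initialising such a model from a decision tree, as done here with OC1, amounts to seeding the gating hierarchy from the tree's split structure before joint optimisation.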