Generalized relevance vector machine
Yuheng Jia, S. Kwong, Wenhui Wu, Wei Gao, Ran Wang
2017 Intelligent Systems Conference (IntelliSys), September 2017
DOI: 10.1109/INTELLISYS.2017.8324361
Citations: 2
Abstract
This paper considers a generalized version of the relevance vector machine (RVM), a sparse Bayesian kernel machine for classification and ordinary regression. The generalized RVM (GRVM) follows the generalized linear model (GLM), which is a natural generalization of the ordinary linear regression model and shares a common approach to parameter estimation. GRVM inherits the advantages of GLM: a unified model structure, a single training algorithm, and convenient task-specific model design. It also inherits the advantages of RVM: probabilistic output, extremely sparse solutions, and automatic hyperparameter estimation. Moreover, GRVM extends RVM to a wider range of learning tasks beyond classification and ordinary regression by assuming that the conditional output follows an exponential family distribution (EFD). Since the EFD makes exact Bayesian inference intractable, the Laplace approximation, a common approach in Bayesian inference, is adopted to solve this problem. Several task-specific models are then designed based on GRVM, including models for ordinary regression, count data regression, classification, and ordinal regression. The relationship between GRVM and the traditional RVM models is also discussed. Finally, experimental results demonstrate the efficiency of the proposed GRVM model.
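To make the Laplace approximation mentioned in the abstract concrete, the sketch below applies it to a toy count-data (Poisson) regression with a single weight, one of the EFD cases GRVM covers. This is a hypothetical illustration, not the paper's code: the data, the variable names, and the fixed prior precision `alpha` are assumptions (a full RVM-style treatment would re-estimate `alpha` from the data).

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical toy example (not the paper's implementation): Laplace
# approximation of the posterior over one weight w in a count-data model,
# y_i ~ Poisson(exp(w * x_i)), with a zero-mean Gaussian prior on w.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0, size=50)
w_true = 0.7
y = rng.poisson(np.exp(w_true * x))

alpha = 1.0  # assumed fixed prior precision; RVM machinery would update it

def neg_log_posterior(w):
    eta = w * x                                # linear predictor, log link
    log_lik = np.sum(y * eta - np.exp(eta))    # Poisson log-likelihood (up to a constant)
    log_prior = -0.5 * alpha * w ** 2          # Gaussian prior (up to a constant)
    return -(log_lik + log_prior)

# Step 1: locate the posterior mode (the MAP estimate).
w_map = minimize_scalar(neg_log_posterior).x

# Step 2: the negative second derivative of the log-posterior at the mode
# gives the precision of the approximating Gaussian:
#   -d^2/dw^2 log p(w | y) = sum_i x_i^2 exp(w x_i) + alpha.
precision = np.sum(x ** 2 * np.exp(w_map * x)) + alpha
posterior_var = 1.0 / precision

print(w_map, posterior_var)
```

The intractable posterior is thus replaced by a Gaussian centered at the mode with covariance given by the inverse curvature; because every exponential family likelihood yields the same mode-plus-curvature recipe, one approximation scheme serves all of the task-specific models the paper lists.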