Supervised latent linear Gaussian process latent variable model for dimensionality reduction.

Xinwei Jiang, Junbin Gao, Tianjiang Wang, Lihong Zheng
{"title":"Supervised latent linear Gaussian process latent variable model for dimensionality reduction.","authors":"Xinwei Jiang,&nbsp;Junbin Gao,&nbsp;Tianjiang Wang,&nbsp;Lihong Zheng","doi":"10.1109/TSMCB.2012.2196995","DOIUrl":null,"url":null,"abstract":"<p><p>The Gaussian process (GP) latent variable model (GPLVM) has the capability of learning low-dimensional manifold from highly nonlinear data of high dimensionality. As an unsupervised dimensionality reduction (DR) algorithm, the GPLVM has been successfully applied in many areas. However, in its current setting, GPLVM is unable to use label information, which is available for many tasks; therefore, researchers proposed many kinds of extensions to the GPLVM in order to utilize extra information, among which the supervised GPLVM (SGPLVM) has shown better performance compared with other SGPLVM extensions. However, the SGPLVM suffers in its high computational complexity. Bearing in mind the issues of the complexity and the need of incorporating additionally available information, in this paper, we propose a novel SGPLVM, called supervised latent linear GPLVM (SLLGPLVM). Our approach is motivated by both SGPLVM and supervised probabilistic principal component analysis (SPPCA). The proposed SLLGPLVM can be viewed as an appropriate compromise between the SGPLVM and the SPPCA. Furthermore, it is also appropriate to interpret the SLLGPLVM as a semiparametric regression model for supervised DR by making use of the GP to model the unknown smooth link function. Complexity analysis and experiments show that the developed SLLGPLVM outperforms the SGPLVM not only in the computational complexity but also in its accuracy. 
We also compared the SLLGPLVM with two classical supervised classifiers, i.e., a GP classifier and a support vector machine, to illustrate the advantages of the proposed model.</p>","PeriodicalId":55006,"journal":{"name":"IEEE Transactions on Systems Man and Cybernetics Part B-Cybernetics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2012-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1109/TSMCB.2012.2196995","citationCount":"32","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Systems Man and Cybernetics Part B-Cybernetics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TSMCB.2012.2196995","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2012/5/17 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 32

Abstract

The Gaussian process (GP) latent variable model (GPLVM) can learn a low-dimensional manifold from highly nonlinear, high-dimensional data. As an unsupervised dimensionality reduction (DR) algorithm, the GPLVM has been successfully applied in many areas. However, in its standard form, the GPLVM cannot use label information, which is available in many tasks; researchers have therefore proposed many extensions to the GPLVM that utilize this extra information, among which the supervised GPLVM (SGPLVM) has shown better performance than other supervised extensions. However, the SGPLVM suffers from high computational complexity. To address the complexity issue while still incorporating the additionally available information, in this paper we propose a novel SGPLVM, called the supervised latent linear GPLVM (SLLGPLVM). Our approach is motivated by both the SGPLVM and supervised probabilistic principal component analysis (SPPCA), and the proposed SLLGPLVM can be viewed as an appropriate compromise between the two. Furthermore, the SLLGPLVM can be interpreted as a semiparametric regression model for supervised DR, in which a GP models the unknown smooth link function. Complexity analysis and experiments show that the SLLGPLVM outperforms the SGPLVM not only in computational complexity but also in accuracy. We also compare the SLLGPLVM with two classical supervised classifiers, a GP classifier and a support vector machine, to illustrate the advantages of the proposed model.
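The semiparametric structure the abstract describes — a linear map projecting high-dimensional inputs to low-dimensional latent coordinates, with a GP modeling the unknown smooth link function from latent space to outputs — can be sketched minimally as follows. This is an illustration of that structure only, not the authors' algorithm: in the SLLGPLVM the projection W is learned by maximizing the GP likelihood, whereas here W is simply fixed to the generating direction of a toy dataset, and all function names are our own.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential (RBF) covariance between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def fit_gp(Z, y, noise=1e-2):
    # Standard GP regression: solve (K + noise*I) alpha = y once.
    K = rbf_kernel(Z, Z) + noise * np.eye(len(Z))
    return np.linalg.solve(K, y)

def predict_gp(Z_train, alpha, Z_test):
    # Posterior mean at test latent coordinates.
    return rbf_kernel(Z_test, Z_train) @ alpha

# Toy data: 5-D inputs whose labels depend only on a 1-D linear projection.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
w_true = np.array([1.0, -1.0, 0.0, 0.0, 0.0])
y = np.sin(X @ w_true)

# "Latent linear" step: a stand-in for the W that the SLLGPLVM would learn.
W = w_true[:, None]
Z = X @ W                       # low-dimensional latent coordinates
alpha = fit_gp(Z, y)            # GP models the unknown smooth link function
y_hat = predict_gp(Z, alpha, Z)
```

The point of the sketch is the division of labor: the linear map handles dimensionality reduction (as in SPPCA), while the GP handles the nonlinearity of the link function (as in the GPLVM family), which is what keeps the latent mapping cheap relative to a fully nonlinear SGPLVM.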
