Review of Dimension Reduction Methods

S. Nanga, A. T. Bawah, Ben Acquaye, Mac-Issaka Billa, Francisco Baeta, N. Odai, Samuel Kwaku Obeng, Ampem Darko Nsiah
{"title":"Review of Dimension Reduction Methods","authors":"S. Nanga, A. T. Bawah, Ben Acquaye, Mac-Issaka Billa, Francisco Baeta, N. Odai, Samuel Kwaku Obeng, Ampem Darko Nsiah","doi":"10.4236/jdaip.2021.93013","DOIUrl":null,"url":null,"abstract":"Purpose: This study sought to review the characteristics, strengths, weaknesses \nvariants, applications areas and data types applied on the various Dimension Reduction techniques. Methodology: The \nmost commonly used databases employed to search for the papers were ScienceDirect, \nScopus, Google Scholar, IEEE Xplore and Mendeley. An integrative review was \nused for the study where 341 papers were reviewed. Results: The linear \ntechniques considered were Principal Component Analysis (PCA), Linear Discriminant \nAnalysis (LDA), Singular Value Decomposition (SVD), Latent Semantic Analysis \n(LSA), Locality Preserving Projections (LPP), Independent Component Analysis \n(ICA) and Project Pursuit (PP). The non-linear techniques which were developed \nto work with applications that have complex non-linear structures considered were Kernel Principal Component \nAnalysis (KPCA), Multi-dimensional \nScaling (MDS), Isomap, Locally Linear Embedding (LLE), Self-Organizing Map \n(SOM), Latent Vector Quantization (LVQ), t-Stochastic neighbor embedding (t-SNE) and Uniform Manifold Approximation and \nProjection (UMAP). DR techniques can further be categorized into supervised, \nunsupervised and more recently semi-supervised learning methods. The supervised \nversions are the LDA and LVQ. All the other techniques are unsupervised. \nSupervised variants of PCA, LPP, KPCA and MDS have been developed. \nSupervised and semi-supervised variants of PP and t-SNE have also been \ndeveloped and a semi supervised version of the LDA has been developed. Conclusion: The various application areas, strengths, weaknesses and variants of the DR \ntechniques were explored. The different data types that have been applied on \nthe various DR techniques were also explored.","PeriodicalId":71434,"journal":{"name":"数据分析和信息处理(英文)","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2021-07-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"19","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"数据分析和信息处理(英文)","FirstCategoryId":"1093","ListUrlMain":"https://doi.org/10.4236/jdaip.2021.93013","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 19

Abstract

Purpose: This study sought to review the characteristics, strengths, weaknesses, variants, application areas and data types used with the various Dimension Reduction (DR) techniques. Methodology: The databases most commonly employed to search for papers were ScienceDirect, Scopus, Google Scholar, IEEE Xplore and Mendeley. An integrative review was used for the study, in which 341 papers were reviewed. Results: The linear techniques considered were Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Singular Value Decomposition (SVD), Latent Semantic Analysis (LSA), Locality Preserving Projections (LPP), Independent Component Analysis (ICA) and Projection Pursuit (PP). The non-linear techniques considered, which were developed for applications with complex non-linear structures, were Kernel Principal Component Analysis (KPCA), Multi-dimensional Scaling (MDS), Isomap, Locally Linear Embedding (LLE), Self-Organizing Map (SOM), Learning Vector Quantization (LVQ), t-distributed Stochastic Neighbor Embedding (t-SNE) and Uniform Manifold Approximation and Projection (UMAP). DR techniques can further be categorized into supervised, unsupervised and, more recently, semi-supervised learning methods. The supervised techniques are LDA and LVQ; all the other techniques are unsupervised. Supervised variants of PCA, LPP, KPCA and MDS have been developed; supervised and semi-supervised variants of PP and t-SNE have also been developed, as has a semi-supervised version of LDA. Conclusion: The various application areas, strengths, weaknesses and variants of the DR techniques were explored, as were the different data types to which the various DR techniques have been applied.
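To make the linear/non-linear and supervised/unsupervised distinctions above concrete, the sketch below applies three of the reviewed techniques to a small benchmark dataset using scikit-learn. This example is not from the paper; the dataset, parameter values and variable names are illustrative assumptions only.

from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.manifold import TSNE

# 8x8 handwritten digits: 1797 samples in 64 dimensions, 10 classes.
X, y = load_digits(return_X_y=True)

# Linear, unsupervised: project onto the two leading principal components.
X_pca = PCA(n_components=2).fit_transform(X)

# Linear, supervised: LDA uses the labels y to find class-discriminative axes.
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

# Non-linear, unsupervised: t-SNE preserves local neighbourhood structure.
X_tsne = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)

print(X_pca.shape, X_lda.shape, X_tsne.shape)  # (1797, 2) for each embedding

Several of the other reviewed techniques follow the same fit_transform pattern: Kernel PCA via sklearn.decomposition.KernelPCA, Isomap and LLE via sklearn.manifold, and UMAP via the separate umap-learn package.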