{"title":"基于LEML的SPD矩阵鲁棒软LVQ","authors":"Fengzhen Tang;Xiaocheng Zhang","doi":"10.1109/TETCI.2024.3515009","DOIUrl":null,"url":null,"abstract":"Many learning scenarios involve non-Euclidean data. For instance, in electroencephalogram (EEG) or image classification, the input can be characterized by symmetric positive-definite (SPD) matrices. These matrices live on the curved Riemannian manifold instead of the flat Euclidean space. In such situations, classical Euclidean learning methods may fail due to the non-Euclidean nature of the data. This article proposes to generalize robust soft learning vector quantization (RSLVQ) targeted for Euclidean data to cope with such data in the log-Euclidean framework. Log-Euclidean metric learning (LEML) is incorporated into the RSLVQ framework, jointly learning the prototypes on the manifold and the distance metric tensor of the tangent map that projects the original tangent space to a more discriminative one. Two methods are subsequently proposed to learn the distance metric tensor. It is firstly confined to have full rank and treated as an SPD matrix, which is learned using the log-Euclidean framework. Then, this constraint is removed letting it become a symmetric positive semidefinite (SPSD) matrix, which is learned using the quotient geometry. In addition, we propose to adapt the variance parameter introduced in the probabilistic modelling during the training course of the classifier by the minimization of the negative log likelihood function with respect to this parameter. Experiments on multiple data sets with different properties show the proposed methods have good classification performances and low computational complexities.","PeriodicalId":13135,"journal":{"name":"IEEE Transactions on Emerging Topics in Computational Intelligence","volume":"9 4","pages":"2995-3009"},"PeriodicalIF":5.3000,"publicationDate":"2024-12-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10816565","citationCount":"0","resultStr":"{\"title\":\"Robust Soft LVQ With LEML for SPD Matrices\",\"authors\":\"Fengzhen Tang;Xiaocheng Zhang\",\"doi\":\"10.1109/TETCI.2024.3515009\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Many learning scenarios involve non-Euclidean data. For instance, in electroencephalogram (EEG) or image classification, the input can be characterized by symmetric positive-definite (SPD) matrices. These matrices live on the curved Riemannian manifold instead of the flat Euclidean space. In such situations, classical Euclidean learning methods may fail due to the non-Euclidean nature of the data. This article proposes to generalize robust soft learning vector quantization (RSLVQ) targeted for Euclidean data to cope with such data in the log-Euclidean framework. Log-Euclidean metric learning (LEML) is incorporated into the RSLVQ framework, jointly learning the prototypes on the manifold and the distance metric tensor of the tangent map that projects the original tangent space to a more discriminative one. Two methods are subsequently proposed to learn the distance metric tensor. It is firstly confined to have full rank and treated as an SPD matrix, which is learned using the log-Euclidean framework. Then, this constraint is removed letting it become a symmetric positive semidefinite (SPSD) matrix, which is learned using the quotient geometry. 
In addition, we propose to adapt the variance parameter introduced in the probabilistic modelling during the training course of the classifier by the minimization of the negative log likelihood function with respect to this parameter. Experiments on multiple data sets with different properties show the proposed methods have good classification performances and low computational complexities.\",\"PeriodicalId\":13135,\"journal\":{\"name\":\"IEEE Transactions on Emerging Topics in Computational Intelligence\",\"volume\":\"9 4\",\"pages\":\"2995-3009\"},\"PeriodicalIF\":5.3000,\"publicationDate\":\"2024-12-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10816565\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Emerging Topics in Computational Intelligence\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10816565/\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Emerging Topics in Computational Intelligence","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10816565/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Many learning scenarios involve non-Euclidean data. For instance, in electroencephalogram (EEG) or image classification, the input can be characterized by symmetric positive-definite (SPD) matrices. These matrices lie on a curved Riemannian manifold rather than in flat Euclidean space. In such situations, classical Euclidean learning methods may fail due to the non-Euclidean nature of the data. This article proposes to generalize robust soft learning vector quantization (RSLVQ), originally designed for Euclidean data, so that it can handle such data in the log-Euclidean framework. Log-Euclidean metric learning (LEML) is incorporated into the RSLVQ framework, jointly learning the prototypes on the manifold and the distance metric tensor of the tangent map that projects the original tangent space onto a more discriminative one. Two methods are then proposed to learn the distance metric tensor. First, it is constrained to have full rank and treated as an SPD matrix, which is learned using the log-Euclidean framework. This constraint is then removed, allowing the tensor to become a symmetric positive semidefinite (SPSD) matrix, which is learned using the quotient geometry. In addition, we propose to adapt the variance parameter introduced in the probabilistic modelling during classifier training by minimizing the negative log-likelihood function with respect to this parameter. Experiments on multiple data sets with different properties show that the proposed methods achieve good classification performance with low computational complexity.
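As a rough illustration of the log-Euclidean machinery the abstract refers to, the sketch below computes a squared log-Euclidean distance between SPD matrices under a full-rank metric tensor and the RSLVQ-style soft class posterior that such a distance would feed into. This is a minimal sketch under stated assumptions, not the authors' implementation; the helper names (`spd_logm`, `rslvq_posterior`), the identity metric tensor, and the toy data are illustrative placeholders.

```python
# Minimal illustrative sketch (not the paper's reference code): squared
# log-Euclidean distance between SPD matrices under a learned full-rank metric
# tensor M, and the RSLVQ-style soft class posterior built from it.
# The names spd_logm, M, sigma2, and the toy prototypes are hypothetical.
import numpy as np

def spd_logm(X):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(X)
    return (V * np.log(w)) @ V.T

def log_euclidean_dist2(X, P, M):
    """d^2(X, P) = vec(log X - log P)^T  M  vec(log X - log P)."""
    v = (spd_logm(X) - spd_logm(P)).ravel()   # tangent-space (log-domain) difference
    return float(v @ M @ v)

def rslvq_posterior(X, prototypes, labels, M, sigma2):
    """Soft class probabilities p(y = c | X) from a Gaussian mixture over
    prototypes, as in robust soft LVQ, using the log-Euclidean distance above."""
    d2 = np.array([log_euclidean_dist2(X, P, M) for P in prototypes])
    w = np.exp(-(d2 - d2.min()) / (2.0 * sigma2))   # shift for numerical stability
    w /= w.sum()
    return {c: w[labels == c].sum() for c in np.unique(labels)}

# Toy usage with random SPD matrices and an identity metric tensor.
rng = np.random.default_rng(0)

def rand_spd(n):
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)      # well-conditioned random SPD matrix

X = rand_spd(3)
prototypes = [rand_spd(3) for _ in range(4)]
labels = np.array([0, 0, 1, 1])
print(rslvq_posterior(X, prototypes, labels, M=np.eye(9), sigma2=1.0))
```

In the paper, the prototypes and the metric tensor are learned jointly (with the tensor constrained to be SPD or relaxed to SPSD) and the variance parameter is adapted by minimizing the negative log-likelihood; here they are fixed purely for illustration.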
Journal introduction:
The IEEE Transactions on Emerging Topics in Computational Intelligence (TETCI) publishes original articles on emerging aspects of computational intelligence, including theory, applications, and surveys.
TETCI is an electronic-only publication. TETCI publishes six issues per year.
Authors are encouraged to submit manuscripts on any emerging topic in computational intelligence, especially nature-inspired computing topics not covered by other IEEE Computational Intelligence Society journals. A few illustrative examples are glial cell networks, computational neuroscience, brain-computer interfaces, ambient intelligence, non-fuzzy computing with words, artificial life, cultural learning, artificial endocrine networks, social reasoning, artificial hormone networks, and computational intelligence for the IoT and Smart-X technologies.