{"title":"Mahalanobis distance on Grassmann manifold and its application to brain signal processing","authors":"Y. Washizawa, S. Hotta","doi":"10.1109/MLSP.2012.6349723","DOIUrl":null,"url":null,"abstract":"Multi-dimensional data such as image patterns, image sequences, and brain signals, are often given in the form of the variance-covariance matrices or their eigenspaces to represent their own variations. For example, in face or object recognition problems, variations due to illuminations, camera angles can be represented by eigenspaces. A set of the eigenspaces is called the Grassmann manifold, and simple distance measurements in the Grassmann manifold, such as the projection metric have been used in conventional researches. However, in linear spaces, if the distribution of patterns is not isotropic, statistical distances such as the Mahalanobis distance are reasonable, and their performances are higher than simple distances in many problems. In this paper, we introduce the Mahalanobis distance in the Grassmann manifolds. Two experimental results, an object recognition problem and a brain signal processing, demonstrate the advantages of the proposed distance measurement.","PeriodicalId":262601,"journal":{"name":"2012 IEEE International Workshop on Machine Learning for Signal Processing","volume":"8 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2012 IEEE International Workshop on Machine Learning for Signal Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MLSP.2012.6349723","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
Multi-dimensional data such as image patterns, image sequences, and brain signals are often represented by variance-covariance matrices or their eigenspaces in order to capture their intrinsic variations. For example, in face or object recognition problems, variations due to illumination and camera angle can be represented by eigenspaces. The set of such eigenspaces forms the Grassmann manifold, and simple distance measures on the Grassmann manifold, such as the projection metric, have been used in previous studies. However, in linear spaces, when the distribution of patterns is not isotropic, statistical distances such as the Mahalanobis distance are more appropriate and outperform simple distances in many problems. In this paper, we introduce the Mahalanobis distance on the Grassmann manifold. Two experiments, an object recognition problem and a brain signal processing task, demonstrate the advantages of the proposed distance measure.
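To make the subspace-distance idea concrete, the following is a minimal NumPy sketch of the standard projection metric mentioned in the abstract, computed between the leading eigenspaces of two sample covariance matrices. It illustrates only the conventional baseline, not the paper's proposed Mahalanobis extension, and the function names and toy data are hypothetical.

```python
import numpy as np


def top_eigenspace(cov, k):
    """Orthonormal basis (d x k) spanned by the k leading eigenvectors of a covariance matrix."""
    eigvals, eigvecs = np.linalg.eigh(cov)          # eigh returns eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:k]           # indices of the k largest eigenvalues
    return eigvecs[:, order]


def projection_metric(U, V):
    """Projection metric between k-dimensional subspaces with orthonormal bases U and V:
    d(U, V) = sqrt(k - ||U^T V||_F^2), i.e. the root sum of squared sines of the principal angles."""
    k = U.shape[1]
    return np.sqrt(max(k - np.linalg.norm(U.T @ V, "fro") ** 2, 0.0))


# Toy usage: compare eigenspaces of two sample covariance matrices.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 8))                   # 200 samples, 8-dimensional features
B = rng.standard_normal((200, 8)) + 0.5
U = top_eigenspace(np.cov(A, rowvar=False), k=3)
V = top_eigenspace(np.cov(B, rowvar=False), k=3)
print(projection_metric(U, V))
```

The metric is zero when the two subspaces coincide and grows as their principal angles widen; the paper's contribution is to weight such subspace comparisons statistically, analogously to how the Mahalanobis distance weights directions in a linear space.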