{"title":"Gradient flows on projection matrices for subspace estimation","authors":"A. Srivastava, D. Fuhrmann","doi":"10.1109/ACSSC.1997.679117","DOIUrl":null,"url":null,"abstract":"Estimation of dynamic subspaces is important in blind-channel identification for multiuser wireless communications and active computer vision. Mathematically, a subspace can either be parameterized non-uniquely by a linearly-independent basis, or uniquely, by a projection matrix. We present a stochastic gradient technique for optimization on projective representations of subspaces. This technique is intrinsic, i.e. it utilizes the geometry of underlying parameter space (Grassman manifold) and constructs gradient flows on the manifold for local optimization. The addition of a stochastic component to the search process guarantees global minima and a discrete jump component allows for uncertainty in rank of the subspace (simultaneous model order estimation).","PeriodicalId":240431,"journal":{"name":"Conference Record of the Thirty-First Asilomar Conference on Signals, Systems and Computers (Cat. No.97CB36136)","volume":"51 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1997-11-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Conference Record of the Thirty-First Asilomar Conference on Signals, Systems and Computers (Cat. No.97CB36136)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ACSSC.1997.679117","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5
Abstract
Estimation of dynamic subspaces is important in blind-channel identification for multiuser wireless communications and in active computer vision. Mathematically, a subspace can be parameterized either non-uniquely, by a linearly independent basis, or uniquely, by a projection matrix. We present a stochastic gradient technique for optimization over projective representations of subspaces. This technique is intrinsic, i.e., it exploits the geometry of the underlying parameter space (the Grassmann manifold) and constructs gradient flows on the manifold for local optimization. The addition of a stochastic component to the search process guarantees convergence to the global minimum, and a discrete jump component accommodates uncertainty in the rank of the subspace (simultaneous model-order estimation).
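To illustrate the flavor of the approach (not the authors' exact construction, which works directly on projection matrices and adds a jump process for rank changes), the following is a minimal Python sketch of an annealed stochastic gradient flow on the Grassmann manifold for a fixed rank d. It tracks the subspace through an orthonormal basis U, with the unique projection-matrix representation recovered as P = U U^T. The cost function tr(U^T C U) (maximized, i.e., -tr is minimized, to find the dominant subspace of a covariance C), the QR retraction, the annealing schedule, and all names are assumptions made for illustration.

```python
import numpy as np

def grassmann_stochastic_flow(C, d, steps=2000, step=0.01, T0=0.1, seed=0):
    """Illustrative sketch (hypothetical, not the paper's algorithm):
    annealed stochastic gradient flow on the Grassmann manifold for the
    dominant d-dimensional subspace of a covariance matrix C."""
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    U, _ = np.linalg.qr(rng.standard_normal((n, d)))  # random starting point
    for k in range(steps):
        T = T0 / (1.0 + k)                   # assumed annealing schedule
        G = 2.0 * C @ U                      # Euclidean gradient of tr(U^T C U)
        P_perp = np.eye(n) - U @ U.T         # projector onto the complement
        H = P_perp @ G                       # intrinsic (horizontal) gradient
        W = P_perp @ rng.standard_normal((n, d))  # tangent-space noise
        U = U + step * H + np.sqrt(2.0 * step * T) * W  # Langevin-type step
        U, _ = np.linalg.qr(U)               # retract back onto the manifold
    return U @ U.T                           # unique projection-matrix form

# Usage: estimate the top-2 eigenspace of a sample covariance.
A = np.random.default_rng(1).standard_normal((50, 8))
C = A @ A.T / 50
P = grassmann_stochastic_flow(C, d=2)
```

The projection onto the complement of span(U) keeps both the gradient step and the injected noise tangent to the manifold, and the QR factorization serves as a retraction; the decaying temperature T mimics the stochastic component that the abstract credits with escaping local optima.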