D. Shutin, Haipeng Zheng, Bernard H. Fleury, Sanjeev R. Kulkarni, H. V. Poor
Title: Space-alternating attribute-distributed sparse learning
DOI: 10.1109/CIP.2010.5604254
Published in: 2010 2nd International Workshop on Cognitive Information Processing
Publication date: 2010-06-14
Citations: 6
Abstract
The paper proposes a new variational Bayesian algorithm for multivariate regression with attribute-distributed, or dimensionally distributed, data. In contrast to existing approaches, the proposed algorithm exploits a variational version of the Space-Alternating Generalized Expectation-Maximization (SAGE) algorithm, which, by means of admissible hidden data (an analog of the complete data in the EM framework), allows the parameters of a single agent to be updated while the parameters of the other agents are held fixed. Learning can therefore be implemented in a distributed fashion by updating the agents sequentially, one after another. Inspired by Bayesian sparsity techniques, the algorithm also imposes constraints on the agent parameters via parametric priors, which provides a mechanism for pruning irrelevant agents and for mitigating overfitting. Using synthetic data as well as measurement data from the UCI Machine Learning Repository, it is demonstrated that the proposed algorithm outperforms existing solutions both in achieved mean-square error (MSE) and in convergence speed, owing to its ability to sparsify noninformative agents, while still allowing distributed implementation and flexible agent update protocols.
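The sequential agent-update idea described in the abstract can be illustrated with a toy sketch. This is not the paper's variational SAGE derivation; it is a simplified coordinate-descent analog under assumed details: each agent holds one attribute block, ridge-fits it against the ensemble residual with the other agents frozen, and an ARD-style precision update (a stand-in for the parametric sparsity priors) drives the weights of uninformative agents toward zero. The data layout and update rules here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic attribute-distributed data: three agents, each observing its own
# two-column attribute block; only the first two blocks are informative.
n = 200
blocks = [rng.standard_normal((n, 2)) for _ in range(3)]
y = blocks[0] @ np.array([1.5, -0.7]) + blocks[1] @ np.array([0.8, 0.3])
y += 0.05 * rng.standard_normal(n)

local_w = [np.zeros(b.shape[1]) for b in blocks]  # per-agent weights
alpha = np.ones(len(blocks))                      # per-agent prior precisions

for sweep in range(50):
    for k, b in enumerate(blocks):
        # Residual with every other agent's contribution held fixed
        # (the sequential, one-agent-at-a-time update from the abstract).
        residual = y - sum(
            blocks[j] @ local_w[j] for j in range(len(blocks)) if j != k
        )
        # Local ridge fit, regularized by this agent's prior precision.
        d = b.shape[1]
        local_w[k] = np.linalg.solve(
            b.T @ b + alpha[k] * np.eye(d), b.T @ residual
        )
        # Crude ARD-style precision update: agents whose weights stay small
        # get ever-stronger shrinkage and are effectively pruned.
        alpha[k] = d / (local_w[k] @ local_w[k] + 1e-10)

print("agent weight norms:", [round(float(np.linalg.norm(w)), 4) for w in local_w])
```

After the sweeps, the third (noninformative) agent's weights collapse to essentially zero while the informative agents recover weights close to the generating values, mimicking the pruning behavior the abstract attributes to the sparsity priors.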