Authors: Thomas Buchgraber, D. Shutin
Venue: 2012 IEEE International Workshop on Machine Learning for Signal Processing
Published: 2012-11-12
DOI: 10.1109/MLSP.2012.6349800
Distributed variational sparse Bayesian learning for sensor networks
In this work we present a distributed sparse Bayesian learning (dSBL) regression algorithm. It can be used for collaborative sparse estimation of spatial functions in wireless sensor networks (WSNs). The sensor measurements are modeled as a weighted superposition of basis functions. When kernels are used, the algorithm forms a distributed version of the relevance vector machine. The proposed method is based on a combination of variational inference and loopy belief propagation, where data are exchanged only between neighboring nodes, without the need for a fusion center. We show that for tree-structured networks, under a certain parameterization, dSBL coincides with centralized sparse Bayesian learning (cSBL). For general loopy networks, dSBL and cSBL differ; simulations show, however, that dSBL converges much faster over the variational inference iterations while achieving similar sparsity and mean-squared-error performance. Furthermore, in contrast to other sparse distributed regression methods, our method does not require any cross-tuning of sparsity parameters.
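The paper's distributed algorithm itself is not reproduced here, but as background, the centralized baseline (cSBL) that dSBL is compared against can be sketched. The snippet below is a minimal, illustrative implementation of classical sparse Bayesian learning in the relevance-vector-machine style (Tipping's evidence-maximization fixed-point updates), assuming a linear model `t = Phi @ w + noise` with an independent Gaussian prior precision `alpha[i]` on each weight; the function name and parameters are hypothetical, not from the paper.

```python
import numpy as np

def sbl_regression(Phi, t, n_iter=100, alpha_init=1.0, beta_init=1.0, alpha_cap=1e6):
    """Illustrative centralized sparse Bayesian learning (RVM-style).

    Model: t = Phi @ w + eps, eps ~ N(0, 1/beta), w[i] ~ N(0, 1/alpha[i]).
    Large alpha[i] after convergence means basis function i is pruned.
    """
    N, M = Phi.shape
    alpha = np.full(M, alpha_init)  # per-weight prior precisions
    beta = beta_init                # noise precision
    for _ in range(n_iter):
        # Gaussian posterior over the weights: N(mu, Sigma)
        Sigma = np.linalg.inv(beta * Phi.T @ Phi + np.diag(alpha))
        mu = beta * Sigma @ Phi.T @ t
        # "Well-determinedness" factors and fixed-point hyperparameter updates
        gamma = 1.0 - alpha * np.diag(Sigma)
        alpha = gamma / np.maximum(mu**2, 1e-12)
        resid = t - Phi @ mu
        beta = (N - gamma.sum()) / max(resid @ resid, 1e-12)
        # Cap alpha for numerical stability; capped weights are effectively zero
        alpha = np.minimum(alpha, alpha_cap)
    return mu, alpha, beta
```

The per-weight precisions `alpha[i]` are what drives sparsity: irrelevant weights have their precision driven to very large values, shrinking the corresponding posterior mean to zero. In the paper's distributed variant, the same kind of posterior and hyperparameter computation is carried out via message passing between neighboring sensor nodes rather than at a fusion center.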