{"title":"Reducing the Computational Cost of Bayesian Indoor Positioning Systems","authors":"Konstantinos Kleisouris, R. Martin","doi":"10.1109/SAHCN.2006.288512","DOIUrl":null,"url":null,"abstract":"In this work we show how to reduce the computational cost of using Bayesian networks for localization. We investigate a range of Monte Carlo sampling strategies, including Gibbs and Metropolis. We found that for our Gibbs samplers, most of the time is spent in slice sampling. Moreover, our results show that although uniform sampling over the entire domain suffers occasional rejections, it has a much lower overall computational cost than approaches that carefully avoid rejections. The key reason for this efficiency is the flatness of the full conditionals in our localization networks. Our sampling technique is also attractive because it does not require extensive tuning to achieve good performance, unlike the Metropolis samplers. We demonstrate that our whole domain sampling technique converges accurately with low latency. On commodity hardware our sampler localizes up to 10 points in less than half a second, which is over 10 times faster than a common general-purpose Bayesian sampler. Our sampler also scales well, localizing 51 objects with no location information in the training set in less than 6 seconds. Finally, we present an analytic model that describes the number of evaluations per variable using slice sampling. The model allows us to analytically determine how flat a distribution should be so that whole domain sampling is computationally more efficient when compared to other methods","PeriodicalId":58925,"journal":{"name":"Digital Communications and Networks","volume":"91 1","pages":"555-564"},"PeriodicalIF":0.0000,"publicationDate":"2006-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"13","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Digital Communications and Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SAHCN.2006.288512","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 13
Abstract
In this work we show how to reduce the computational cost of using Bayesian networks for localization. We investigate a range of Monte Carlo sampling strategies, including Gibbs and Metropolis. We find that our Gibbs samplers spend most of their time in slice sampling. Moreover, our results show that although uniform sampling over the entire domain suffers occasional rejections, it has a much lower overall computational cost than approaches that carefully avoid rejections. The key reason for this efficiency is the flatness of the full conditionals in our localization networks. Our sampling technique is also attractive because, unlike the Metropolis samplers, it does not require extensive tuning to achieve good performance. We demonstrate that our whole-domain sampling technique converges accurately with low latency. On commodity hardware our sampler localizes up to 10 points in less than half a second, over 10 times faster than a common general-purpose Bayesian sampler. Our sampler also scales well, localizing 51 objects with no location information in the training set in less than 6 seconds. Finally, we present an analytic model of the number of evaluations per variable under slice sampling. The model allows us to determine analytically how flat a distribution must be for whole-domain sampling to be computationally more efficient than other methods.
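The abstract describes the whole-domain idea only at a high level, and the authors' implementation is not reproduced here. The sketch below is a minimal, hypothetical Python illustration of a single slice-sampling update that proposes uniformly over the entire domain instead of stepping out and shrinking an interval; the names whole_domain_slice_sample and log_density are illustrative, not from the paper.

```python
import math
import random

def whole_domain_slice_sample(log_density, lo, hi, x0, rng=random):
    """One slice-sampling update for a variable with domain [lo, hi].

    Instead of step-out/shrinkage, candidates are drawn uniformly over
    the whole domain until one lands inside the slice. This is valid
    because the domain is a superset of every slice, so an accepted
    point is uniform on the slice itself.
    """
    # Auxiliary height y ~ Uniform(0, f(x0)), handled in log space;
    # 1 - random() lies in (0, 1], so the log is always defined.
    log_y = log_density(x0) + math.log(1.0 - rng.random())
    # Rejection loop: each trial costs one density evaluation. When the
    # full conditional is nearly flat, the slice covers most of the
    # domain and rejections are rare.
    while True:
        x = rng.uniform(lo, hi)
        if log_density(x) > log_y:
            return x

if __name__ == "__main__":
    # Hypothetical flat-ish full conditional for an x-coordinate on a
    # 10-unit domain: a broad, unnormalized Gaussian bump.
    log_f = lambda x: -0.5 * ((x - 4.0) / 6.0) ** 2
    x, draws = 5.0, []
    for _ in range(2000):
        x = whole_domain_slice_sample(log_f, 0.0, 10.0, x)
        draws.append(x)
    print("posterior mean estimate:", sum(draws) / len(draws))
```

Because each candidate is an independent uniform draw, the number of density evaluations per update is geometrically distributed with success probability |S(y)| / (hi - lo), where S(y) is the slice at height y; its mean, (hi - lo) / |S(y)|, approaches 1 as the conditional flattens. That ratio is, roughly, the quantity the abstract's analytic model trades off against rejection-free alternatives.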