Compressed Streaming Importance Sampling for Efficient Representations of Localization Distributions

A. S. Bedi, Alec Koppel, Brian M. Sadler, V. Elvira

2019 53rd Asilomar Conference on Signals, Systems, and Computers, pp. 477-481, November 2019. DOI: 10.1109/IEEECONF44664.2019.9048744
Importance sampling (IS) is the standard Monte Carlo tool for computing integrals involving random variables, such as their mean or higher-order moments. This procedure permits localizing a source signal corrupted by observation noise whose distribution is arbitrary, in contrast to typical Gaussian assumptions. We note that IS is asymptotically consistent as the number of Monte Carlo samples, and hence of Dirac deltas (particles) that parameterize the density estimate, goes to infinity. Unfortunately, letting the number of particles in the density approximation grow without bound is computationally intractable. Here we present a methodology that keeps only a finite, representative subset of particles and their augmented importance weights while remaining nearly statistically consistent. To do so, we approximate importance sampling in two ways: (1) we replace the deltas by kernels, yielding kernel density estimates; and (2) we sequentially project the kernel density estimates onto nearby lower-dimensional subspaces. Theoretically, the asymptotic bias of this scheme is characterized by a compression parameter and the kernel bandwidth, which together yield a tunable trade-off between statistical consistency and memory. We then evaluate the validity of the proposed approach on a localization problem in wireless systems and observe that the proposed algorithm yields a favorable trade-off between memory and accuracy.
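The pipeline the abstract describes — self-normalized importance sampling, kernel smoothing of the particle set, then compression to a small subset — can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the heavy-tailed target, the Gaussian proposal, and the weight-threshold compression rule are all assumptions standing in for the paper's subspace-projection scheme and its bias analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def target_unnorm(x):
    # Unnormalized heavy-tailed target density centered at 2.0
    # (stand-in for a non-Gaussian localization posterior).
    return (1.0 + (x - 2.0) ** 2) ** -2

# (1) Self-normalized importance sampling: draw from a Gaussian
# proposal N(0, 9) and weight each sample by target/proposal.
n = 5000
x = rng.normal(0.0, 3.0, size=n)
q = np.exp(-x ** 2 / 18.0) / np.sqrt(18.0 * np.pi)  # N(0, 9) density
w = target_unnorm(x) / q
w /= w.sum()                                        # normalized weights
mean_est = np.sum(w * x)                            # posterior-mean estimate

# (2) Kernel density estimate: replace each Dirac delta with a
# Gaussian kernel of bandwidth h (a tunable smoothing parameter).
h = 0.3
def kde(t, centers, weights, h):
    return np.sum(weights
                  * np.exp(-(t - centers) ** 2 / (2 * h ** 2))
                  / (h * np.sqrt(2.0 * np.pi)))

# (3) Compression: retain only particles whose weight exceeds a
# threshold, then renormalize. This crude rule mimics the memory
# reduction of the paper's sequential subspace projection.
eps = 1e-4
keep = w > eps
xc, wc = x[keep], w[keep] / w[keep].sum()
```

In this sketch the threshold `eps` plays the role of the compression parameter: raising it shrinks the retained particle set (less memory) at the cost of additional bias in the kernel density estimate, mirroring the consistency/memory trade-off the abstract describes.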