{"title":"Communicating the Difference of Correlated Gaussian Sources over a MAC","authors":"R. Soundararajan, S. Vishwanath","doi":"10.1109/DCC.2009.17","DOIUrl":"https://doi.org/10.1109/DCC.2009.17","url":null,"abstract":"This paper considers the problem of transmitting the difference of two positively correlated Gaussian sources over a two-user additive Gaussian noise multiple access channel (MAC). The goal is to recover this difference within an average mean squared error distortion criterion. Each transmitter has access to only one of the two Gaussian sources and is limited by an average power constraint. In this work, a lattice coding scheme that achieves a distortion within a constant of a distortion lower bound is presented if the signal to noise ratio (SNR) is greater than a threshold. Further, uncoded transmission is shown to be worse in performance to lattice coding methods for correlation coefficients above a threshold. An alternative lattice coding scheme is also presented that can potentially improve on the performance of uncoded transmission.","PeriodicalId":377880,"journal":{"name":"2009 Data Compression Conference","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114288191","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Low-Memory Adaptive Prefix Coding","authors":"T. Gagie, Marek Karpinski, Yakov Nekrich","doi":"10.1109/DCC.2009.61","DOIUrl":"https://doi.org/10.1109/DCC.2009.61","url":null,"abstract":"In this paper we study the adaptive prefix coding problem in cases where the size of the input alphabet is large. We present an online prefix coding algorithm that uses $O(sigma^{1 / lambda + epsilon}) $ bits of space for any constants $eps≫0$, $lambda≫1$, and encodes the string of symbols in $O(log log sigma)$ time per symbol emph{in the worst case}, where $sigma$ is the size of the alphabet. The upper bound on the encoding length is $lambda n H (s) +(lambda / ln 2 + 2 + epsilon) n + O (sigma^{1 / lambda} log^2 sigma)$ bits.","PeriodicalId":377880,"journal":{"name":"2009 Data Compression Conference","volume":"84 11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-11-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130067961","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}