Hyper Binning for Distributed Function Coding
Derya Malak, M. Médard
2020 IEEE 21st International Workshop on Signal Processing Advances in Wireless Communications (SPAWC), May 2020
DOI: 10.1109/spawc48557.2020.9154232
Abstract: We consider the distributed source encoding problem with two correlated sources X1 and X2 and a destination that seeks the outcome of a continuous function f(X1, X2). We develop a compression scheme called hyper binning in order to quantize f. Hyper binning is a natural generalization of Cover's random code construction for the asymptotically optimal Slepian-Wolf encoding scheme, which makes use of binning. The key idea behind this approach is to use linear discriminant analysis to characterize different source feature combinations. The scheme exploits both the correlation between the sources and the function's structure as a means of dimensionality reduction. We investigate the performance of hyper binning for different source distributions, and identify which classes of sources require more partitioning to achieve better function approximation.
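To make the binning idea concrete, here is a minimal toy sketch (our illustration, not the paper's exact construction): the joint source space (X1, X2) is partitioned by linear boundaries, in the spirit of linear discriminants, so that points in the same cell have similar values of f(X1, X2); an encoder then needs to convey only the cell (bin) index rather than the sources themselves. The function f, the hyperplane coefficients, and the sample points below are all assumptions chosen for illustration.

```python
# Toy sketch of hyper binning: quantize a continuous function
# f(X1, X2) by partitioning the source plane with hyperplanes
# whose sign pattern serves as the bin index.

def f(x1, x2):
    # Example continuous target function (assumed for illustration).
    return x1 + 2.0 * x2

def hyper_bin(x1, x2, hyperplanes):
    """Map (x1, x2) to a bin index given by the sign pattern of the
    linear discriminants a*x1 + b*x2 + c, one per hyperplane."""
    return tuple(int(a * x1 + b * x2 + c > 0) for a, b, c in hyperplanes)

# Two hand-picked hyperplanes aligned with the level sets of f,
# so each sign pattern corresponds to a range of function values.
planes = [(1.0, 2.0, -1.5), (1.0, 2.0, -0.5)]

samples = [(0.1, 0.1), (0.9, 0.1), (0.2, 0.8), (0.9, 0.9)]
for x1, x2 in samples:
    print((x1, x2), "f =", f(x1, x2), "bin =", hyper_bin(x1, x2, planes))
```

Because the boundaries here parallel the level sets of f, each bin index pins down an interval of function values; with fewer bins than source pairs, the bin index is a compressed description of f, which is the dimensionality-reduction effect the abstract refers to.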