{"title":"Learning Multi-Rate Vector Quantization for Remote Deep Inference","authors":"M. Malka, Shai Ginzach, Nir Shlezinger","doi":"10.1109/ICASSPW59220.2023.10193526","DOIUrl":null,"url":null,"abstract":"Remote inference accommodates a broad range of scenarios, where inference is carried out using data acquired at a remote user. When the sensing and inferring users communicate over rate limited channels, compression of the data reduces latency, and deep learning enables to jointly learn the compression encoding along with the inference rule. However, because the data is compressed into a fixed number of bits, the resolution cannot be adapted to changes in channel conditions. In this work we propose a multi-rate remote deep inference scheme, which trains a single encoder-decoder model that uses learned vector quantizers while supporting different quantization levels. Our scheme is based on designing nested codebooks along with a learning algorithm based on progressive learning. Numerical results demonstrate that the proposed scheme yields remote deep inference that operates with multiple rates while approaching the performance of fixed-rate models.","PeriodicalId":158726,"journal":{"name":"2023 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW)","volume":"25 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICASSPW59220.2023.10193526","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Remote inference accommodates a broad range of scenarios in which inference is carried out using data acquired at a remote user. When the sensing and inferring users communicate over rate-limited channels, compressing the data reduces latency, and deep learning makes it possible to learn the compression encoding jointly with the inference rule. However, because the data is compressed into a fixed number of bits, the resolution cannot adapt to changes in channel conditions. In this work we propose a multi-rate remote deep inference scheme that trains a single encoder-decoder model using learned vector quantizers while supporting different quantization levels. Our scheme is based on designing nested codebooks together with a progressive learning algorithm. Numerical results demonstrate that the proposed scheme yields remote deep inference that operates at multiple rates while approaching the performance of fixed-rate models.
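The nested-codebook idea behind the abstract can be illustrated with a minimal sketch (this is an assumption-laden illustration, not the authors' implementation): a single codebook is shared across rates, and a lower rate simply restricts the quantizer to a prefix of that codebook, so the codebooks for all rates are nested inside one another. The codebook here is random rather than learned, and `dim`, `max_bits`, and `quantize` are illustrative names.

```python
import numpy as np

# Illustrative sketch of nested vector quantization: one shared codebook,
# where a b-bit rate uses only its first 2**b entries. A learned scheme
# would train these entries; here they are random for demonstration.

rng = np.random.default_rng(0)
dim = 4                                    # dimension of each quantized vector
max_bits = 4                               # highest rate: 2**4 = 16 codewords
codebook = rng.normal(size=(2 ** max_bits, dim))

def quantize(x, bits):
    """Quantize x using the first 2**bits entries (the nested sub-codebook)."""
    sub = codebook[: 2 ** bits]            # prefix shared by all higher rates
    idx = int(np.argmin(np.linalg.norm(sub - x, axis=1)))
    return idx, sub[idx]

x = rng.normal(size=dim)
for b in (1, 2, max_bits):
    idx, xq = quantize(x, b)
    print(f"{b} bits -> index {idx}, error {np.linalg.norm(x - xq):.3f}")
```

Because each lower-rate codebook is a subset of every higher-rate one, increasing the rate can only reduce (never increase) the quantization error for a given input, which is what lets one model gracefully trade resolution for rate as channel conditions change.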