Title: Fundamental limits on power consumption for lossless signal reconstruction
Author: P. Grover
Venue: 2012 IEEE Information Theory Workshop, September 2012
DOI: 10.1109/ITW.2012.6404730 (https://doi.org/10.1109/ITW.2012.6404730)
Citations: 5
Abstract
Does approaching fundamental limits on rates of information acquisition or transmission fundamentally require increased power consumption in the processing circuitry? Our recent work shows that this is the case for channel coding under some simple circuit and channel models. In this paper, we first develop parallel results for source coding. Reinterpreting existing results on the complexity of lossless source coding, we first observe that the total power consumed in the computational nodes of the encoder and decoder circuitry diverges to infinity as the target error probability approaches zero and the coding rate approaches the source entropy. Next, focusing on on-chip wires, we show that the power consumed in circuit wiring also diverges to infinity as the error probability approaches zero. For the closely related problem of recovering a sparse signal, we first derive a fundamental bound on the required number of "finite-capacity" (e.g., quantized or noisy) measurements. By extending our bounds on wiring complexity and power consumption to sparse-signal recovery, we observe that there is a tradeoff between measurement power and the power required to compute the recovered signal.