Hyperparameter Learning of Stochastic Image Generative Models with Bayesian Hierarchical Modeling and Its Effect on Lossless Image Coding

Yuta Nakahara, T. Matsushima
Published in: 2021 IEEE Information Theory Workshop (ITW)
Publication date: 2021-10-17
DOI: 10.1109/ITW48936.2021.9611418
Citations: 1

Abstract

The explicit assumption of a stochastic data generative model is a remarkable feature of lossless compression of general data in information theory. However, current lossless image coding mostly focuses on coding procedures without explicitly assuming a stochastic generative model. It is therefore difficult to discuss the theoretical optimality of a coding procedure with respect to the stochastic generative model. In this paper, we resolve this difficulty by constructing a stochastic generative model through reinterpreting a previous coding procedure from another perspective. An important problem in our approach is how to learn the hyperparameters of the stochastic generative model, because the optimality of our coding algorithm is guaranteed only asymptotically and the hyperparameter setting still affects the expected code length for finite-length data. For this problem, we use Bayesian hierarchical modeling and confirm its effect by numerical experiments. To the best of our knowledge, this is the first study in lossless image coding that assumes such an explicit stochastic generative model and learns its hyperparameters.
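The abstract's central point — that hyperparameters of a generative model affect the expected Bayes code length for finite-length data even when optimality holds asymptotically — can be illustrated with a minimal sketch. The example below is *not* the paper's image model; it uses a simple Beta-Bernoulli source as an assumed stand-in, coding each symbol with the ideal code length of its posterior predictive probability, and shows that the prior hyperparameters change the total code length on a finite sequence.

```python
import math

def bayes_code_length(bits, alpha, beta):
    """Ideal code length (in bits) of a binary sequence under Bayes coding
    with a Beta(alpha, beta) prior on the Bernoulli parameter.

    Each symbol x_t is assigned -log2 of its posterior predictive
    probability p(x_t | x_1, ..., x_{t-1}); the prior counts (alpha, beta)
    are the hyperparameters whose setting affects finite-length performance.
    """
    a, b = alpha, beta
    total = 0.0
    for x in bits:
        p1 = a / (a + b)                 # predictive P(x = 1)
        p = p1 if x == 1 else 1.0 - p1
        total += -math.log2(p)           # ideal (arithmetic-coding) length
        # Conjugate posterior update: increment the matching count.
        if x == 1:
            a += 1.0
        else:
            b += 1.0
    return total

# A biased finite-length source: 90 ones, 10 zeros.
bits = [1] * 90 + [0] * 10

uniform_prior = bayes_code_length(bits, 1.0, 1.0)  # uninformative hyperparameters
matched_prior = bayes_code_length(bits, 9.0, 1.0)  # hyperparameters matched to the bias

# Asymptotically both achieve the source entropy rate, but on this
# finite sequence the matched hyperparameters yield a shorter code.
print(f"uniform prior: {uniform_prior:.2f} bits")
print(f"matched prior: {matched_prior:.2f} bits")
```

In a hierarchical Bayesian treatment, such as the one the paper proposes, the hyperparameters themselves receive a hyperprior and are learned from the data rather than fixed by hand, shrinking this finite-length gap without committing to a single setting in advance.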