Asymmetric coding of stereoscopic 3D based on perceptual significance

Sid Ahmed Fezza, M. Larabi, K. Faraoun
{"title":"Asymmetric coding of stereoscopic 3D based on perceptual significance","authors":"Sid Ahmed Fezza, M. Larabi, K. Faraoun","doi":"10.1109/ICIP.2014.7026144","DOIUrl":null,"url":null,"abstract":"Asymmetric stereoscopic coding is a very promising technique to decrease the bandwidth required for stereoscopic 3D delivery. However, one large obstacle is linked to the limit of asymmetric coding or the just noticeable threshold of asymmetry, so that 3D viewing experience is not altered. By way of subjective experiments, recent works have attempted to identify this asymmetry threshold. However, fixed threshold, highly dependent on the experiment design, do not allow to adapt to quality and content variation of the image. In this paper, we propose a new non-uniform asymmetric stereoscopic coding adjusting in a dynamic manner the level of asymmetry for each image region to ensure unaltered binocular perception. This is achieved by exploiting several HVS-inspired models; specifically we used the Binocular Just Noticeable Difference (BJND) combined with visual saliency map and depth information to quantify precisely the asymmetry threshold. Simulation results show that the proposed method results in up to 44% of bitrate saving and provides better 3D visual quality compared to state-of-the-art asymmetric coding methods.","PeriodicalId":6856,"journal":{"name":"2014 IEEE International Conference on Image Processing (ICIP)","volume":"29 1","pages":"5656-5660"},"PeriodicalIF":0.0000,"publicationDate":"2014-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 IEEE International Conference on Image Processing (ICIP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIP.2014.7026144","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

Asymmetric stereoscopic coding is a very promising technique for decreasing the bandwidth required for stereoscopic 3D delivery. However, one major obstacle is determining the limit of asymmetric coding, i.e., the just noticeable threshold of asymmetry below which the 3D viewing experience is not altered. Recent works have attempted to identify this asymmetry threshold by way of subjective experiments. However, a fixed threshold, highly dependent on the experiment design, cannot adapt to quality and content variations of the image. In this paper, we propose a new non-uniform asymmetric stereoscopic coding that dynamically adjusts the level of asymmetry for each image region to ensure unaltered binocular perception. This is achieved by exploiting several HVS-inspired models; specifically, we combine the Binocular Just Noticeable Difference (BJND) with a visual saliency map and depth information to precisely quantify the asymmetry threshold. Simulation results show that the proposed method achieves up to 44% bitrate savings and provides better 3D visual quality compared to state-of-the-art asymmetric coding methods.
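To make the idea concrete, the following is a minimal sketch of a per-region asymmetry controller in the spirit of the abstract: blocks of the auxiliary (more coarsely coded) view receive a quantization offset that grows with the binocular masking headroom indicated by a BJND map and shrinks with perceptual significance (saliency and nearness in depth). The block size, the weighting of saliency versus depth, and the mapping to QP offsets are hypothetical illustrative choices, not the authors' actual parameters.

```python
# Illustrative sketch only. The BJND, saliency, and depth maps are assumed
# inputs (the paper derives them from HVS-inspired models); all constants
# below are hypothetical, not taken from the paper.
import numpy as np

def asymmetry_qp_offsets(bjnd, saliency, depth, block=16, max_offset=8):
    """Map per-pixel perceptual maps to a per-block QP offset for the
    auxiliary (more coarsely coded) view.

    bjnd     : HxW array of binocular just-noticeable-difference thresholds
               (higher = more distortion can be binocularly masked).
    saliency : HxW array in [0, 1] (higher = more visually important).
    depth    : HxW array in [0, 1] (higher = closer to the viewer).
    Returns an integer QP offset per block (0 = symmetric quality).
    """
    h, w = bjnd.shape
    nby, nbx = h // block, w // block
    offsets = np.zeros((nby, nbx), dtype=int)
    bjnd_max = bjnd.max() + 1e-8
    for by in range(nby):
        for bx in range(nbx):
            sl = np.s_[by * block:(by + 1) * block,
                       bx * block:(bx + 1) * block]
            # Normalized masking headroom: how much extra distortion
            # this block can hide from the binocular percept.
            headroom = bjnd[sl].mean() / bjnd_max
            # Perceptual significance: salient and near regions tolerate
            # less asymmetry, so they shrink the allowed offset.
            significance = 0.5 * saliency[sl].mean() + 0.5 * depth[sl].mean()
            offsets[by, bx] = int(round(max_offset * headroom
                                        * (1.0 - significance)))
    return offsets

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bjnd = rng.uniform(1.0, 10.0, (64, 64))
    sal = rng.uniform(0.0, 1.0, (64, 64))
    dep = rng.uniform(0.0, 1.0, (64, 64))
    print(asymmetry_qp_offsets(bjnd, sal, dep, block=16))
```

The design choice this sketch captures is the paper's central one: rather than a single fixed asymmetry threshold for the whole frame, the permitted coarseness of the auxiliary view varies region by region with how much asymmetry the HVS can tolerate there.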