Separable KLT for Intra Coding in Versatile Video Coding (VVC)

Kui Fan, Ronggang Wang, Weisi Lin, Jong-Uk Hou, Ling-Yu Duan, Ge Li, Wen Gao
{"title":"Separable KLT for Intra Coding in Versatile Video Coding (VVC)","authors":"Kui Fan, Ronggang Wang, Weisi Lin, Jong-Uk Hou, Ling-Yu Duan, Ge Li, Wen Gao","doi":"10.1109/DCC.2019.00083","DOIUrl":null,"url":null,"abstract":"After the works on the state-of-the-art High Efficiency Video Coding (HEVC) standard, the standard organizations continued to study the potential video coding technologies for the next generation of video coding standard, named Versatile Video Coding (VVC). Transform is a key technique for compression efficiency, and core experiment 6 (CE6) is carried out to explore the transform related coding tools. In this paper, we propose a novel separable transform based on Karhunen-Loève Transform (KLT) to eliminate the horizontal and vertical correlations in the residual samples of intra coding. In the proposed method, the weaknesses of the traditional KLT are addressed. The separable KLT is developed as an alternative transform type in addition to DCT-II, and the transform matrices from 4×4 to 64×64 are trained from intra residual samples. Experimental results show the proposed method can achieve 2.7% bitrate saving averagely on top of the reference software of VVC (VTM-1.1), and the consistent performance improvement on test set also validates the strong generalization capacity of the proposed separable KLT.","PeriodicalId":167723,"journal":{"name":"2019 Data Compression Conference (DCC)","volume":"70 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 Data Compression Conference (DCC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DCC.2019.00083","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Following the completion of the state-of-the-art High Efficiency Video Coding (HEVC) standard, the standardization organizations continued to study potential video coding technologies for the next-generation standard, named Versatile Video Coding (VVC). The transform is a key technique for compression efficiency, and Core Experiment 6 (CE6) was carried out to explore transform-related coding tools. In this paper, we propose a novel separable transform based on the Karhunen-Loève Transform (KLT) to eliminate the horizontal and vertical correlations in the residual samples of intra coding. The proposed method addresses the weaknesses of the traditional KLT. The separable KLT is developed as an alternative transform type in addition to DCT-II, and transform matrices from 4×4 to 64×64 are trained from intra residual samples. Experimental results show that the proposed method achieves an average bitrate saving of 2.7% on top of the VVC reference software (VTM-1.1), and the consistent performance improvement on the test set also validates the strong generalization capability of the proposed separable KLT.
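The abstract outlines the general recipe: orthogonal bases are learned offline from the row and column statistics of intra prediction residuals and then applied separably (vertical and horizontal passes) as an alternative to DCT-II. The NumPy sketch below is only a minimal illustration of that idea, not the authors' VTM-1.1 implementation; the 8×8 block size, the random stand-in training data, and all function names are assumptions introduced here for demonstration.

import numpy as np

# Illustrative sketch: derive separable KLT matrices from N x N residual
# blocks (eigenvectors of the row/column covariance matrices) and apply
# them as a separable transform. Not the authors' VTM implementation.
def train_separable_klt(residual_blocks):
    """residual_blocks: array of shape (num_blocks, N, N) of intra residuals."""
    n = residual_blocks.shape[-1]
    rows = residual_blocks.reshape(-1, n)                      # samples of horizontal correlation
    cols = residual_blocks.transpose(0, 2, 1).reshape(-1, n)   # samples of vertical correlation
    cov_h = rows.T @ rows / rows.shape[0]
    cov_v = cols.T @ cols / cols.shape[0]
    # Eigenvectors of each covariance matrix form a KLT basis, ordered by
    # decreasing eigenvalue for energy compaction.
    _, vec_h = np.linalg.eigh(cov_h)
    _, vec_v = np.linalg.eigh(cov_v)
    klt_h = vec_h[:, ::-1].T   # rows of klt_h are horizontal basis vectors
    klt_v = vec_v[:, ::-1].T   # rows of klt_v are vertical basis vectors
    return klt_v, klt_h

def separable_transform(block, klt_v, klt_h):
    # Vertical transform applied to columns, horizontal transform to rows.
    return klt_v @ block @ klt_h.T

def inverse_separable_transform(coeffs, klt_v, klt_h):
    # The KLT matrices are orthogonal, so the inverse is the transpose.
    return klt_v.T @ coeffs @ klt_h

# Usage: train 8x8 matrices on random stand-in residuals and verify that the
# forward/inverse pair reconstructs a block.
rng = np.random.default_rng(0)
training = rng.normal(size=(1000, 8, 8))
klt_v, klt_h = train_separable_klt(training)
block = rng.normal(size=(8, 8))
coeffs = separable_transform(block, klt_v, klt_h)
assert np.allclose(inverse_separable_transform(coeffs, klt_v, klt_h), block)

In the codec itself, the choice between DCT-II and a trained KLT would presumably be made per block, e.g. by rate-distortion optimization over integerized matrices, which this sketch does not model.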