Blind Video Quality Assessment via Space-Time Slice Statistics

Qi Zheng, Zhengzhong Tu, Zhijian Hao, Xiaoyang Zeng, A. Bovik, Yibo Fan
{"title":"Blind Video Quality Assessment via Space-Time Slice Statistics","authors":"Qi Zheng, Zhengzhong Tu, Zhijian Hao, Xiaoyang Zeng, A. Bovik, Yibo Fan","doi":"10.1109/ICIP46576.2022.9897565","DOIUrl":null,"url":null,"abstract":"User-generated contents (UGC) have gained increased attention in the video quality community recently. Perceptual video quality assessment (VQA) of UGC videos is of great significance for content providers to monitor, process, and deliver massive numbers of UGC videos. Blind video quality prediction of UGC videos is challenging since complex mixtures of spatial and temporal distortions contribute to the overall perceptual quality. In this paper, we develop a simple, effective, and efficient blind VQA framework (STS-QA) based on the statistical analysis of space-time slices (STS) of videos. Specifically, we extract spatio-temporal statistical features along different orientations of video STS, that capture directional global motion, then train a shallow quality predictor. The proposed framework can be used to easily extend any existing video/image quality model to account for temporal or motion regularities. Our experimental results on three publicly available UGC databases demonstrate that our proposed STS-QA model can significantly boost prediction performance compared to baselines. 
The code will be released at: https://github.com/uniqzheng/STS_BVQA.","PeriodicalId":387035,"journal":{"name":"2022 IEEE International Conference on Image Processing (ICIP)","volume":"6 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Image Processing (ICIP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIP46576.2022.9897565","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

User-generated content (UGC) has recently gained increasing attention in the video quality community. Perceptual video quality assessment (VQA) of UGC videos is of great significance for content providers, who must monitor, process, and deliver massive numbers of UGC videos. Blind video quality prediction of UGC videos is challenging, since complex mixtures of spatial and temporal distortions contribute to the overall perceptual quality. In this paper, we develop a simple, effective, and efficient blind VQA framework (STS-QA) based on the statistical analysis of space-time slices (STS) of videos. Specifically, we extract spatio-temporal statistical features along different orientations of video STS that capture directional global motion, then train a shallow quality predictor. The proposed framework can easily extend any existing video/image quality model to account for temporal or motion regularities. Our experimental results on three publicly available UGC databases demonstrate that the proposed STS-QA model can significantly boost prediction performance compared to baselines. The code will be released at: https://github.com/uniqzheng/STS_BVQA.
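To make the idea of space-time slices concrete, here is a minimal illustrative sketch, not the authors' released implementation. It treats a grayscale video as a (T, H, W) volume, extracts two STS orientations (a horizontal x-t slice and a vertical y-t slice), and summarizes each with simple normalized-coefficient statistics of the kind natural-scene-statistics models use. The function names, the choice of central slices, and the particular moment features are all assumptions for illustration; the paper's actual feature set and slice orientations may differ.

```python
import numpy as np

def extract_sts(video):
    """Extract two space-time slices (STS) from a (T, H, W) video volume.

    Returns the central horizontal (x-t) and vertical (y-t) slices,
    two of the orientations an STS-based model might analyze.
    (Illustrative choice; the paper may use other orientations.)
    """
    t, h, w = video.shape
    xt_slice = video[:, h // 2, :]   # shape (T, W): one row tracked over time
    yt_slice = video[:, :, w // 2]   # shape (T, H): one column tracked over time
    return xt_slice, yt_slice

def slice_features(s):
    """Summarize a slice with simple statistics of its normalized coefficients.

    Mean-subtract and contrast-normalize the slice (MSCN-like), then
    return four moments; a hypothetical stand-in for the paper's features.
    """
    s = s.astype(np.float64)
    mu, sigma = s.mean(), s.std() + 1e-8
    z = (s - mu) / sigma
    kurt = ((z - z.mean()) ** 4).mean() / (z.var() ** 2 + 1e-8)
    return np.array([z.mean(), z.var(), np.abs(z).mean(), kurt])

# Example: a random volume stands in for real decoded frames.
video = np.random.rand(60, 120, 160)
xt, yt = extract_sts(video)
feats = np.concatenate([slice_features(xt), slice_features(yt)])
```

The resulting feature vector would then feed a shallow regressor (e.g., a support vector regressor) trained against subjective quality scores, which is the "shallow quality predictor" role described in the abstract.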