{"title":"On Objective and Subjective Quality of 6DoF Synthesized Live Immersive Videos","authors":"Yuan-Chun Sun, Shengkun Tang, Ching-Ting Wang, Cheng-Hsin Hsu","doi":"10.1145/3552469.3555709","DOIUrl":null,"url":null,"abstract":"We address the problem of quantifying the perceived quality in 6DoF (Degree-of-Freedom) live immersive video in two steps. First, we develop a set of tools to generate (or collect) datasets in a photorealistic simulator, AirSim. Using these tools, we get to change diverse settings of live immersive videos, such as scenes, trajectories, camera placements, and encoding parameters. Second, we develop objective and subjective evaluation procedures, and carry out evaluations on a sample immersive video codec, MPEG MIV, using our own dataset. Several insights were found through our experiments: (1) the two synthesizers in TMIV produce comparable target view quality, but RVS runs 2 times faster; (2) Quantization Parameter (QP) is a good control knob to exercise target view quality and bitrate, but camera placements (or trajectories) also impose significant impacts; and (3) overall subjective quality has strong linear/rank correlation with subjective similarity, sharpness, and color. These findings shed some light on the future research problems for the development of emerging applications relying on immersive interactions.","PeriodicalId":296389,"journal":{"name":"Proceedings of the 2nd Workshop on Quality of Experience in Visual Multimedia Applications","volume":"53 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2nd Workshop on Quality of Experience in Visual Multimedia Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3552469.3555709","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
We address the problem of quantifying the perceived quality of 6DoF (Degree-of-Freedom) live immersive video in two steps. First, we develop a set of tools to generate (or collect) datasets in a photorealistic simulator, AirSim. These tools let us vary diverse settings of live immersive videos, such as scenes, trajectories, camera placements, and encoding parameters. Second, we develop objective and subjective evaluation procedures and carry them out on a sample immersive video codec, MPEG MIV, using our own dataset. Our experiments yield several insights: (1) the two synthesizers in TMIV produce comparable target-view quality, but RVS runs about twice as fast; (2) the Quantization Parameter (QP) is a good control knob for trading off target-view quality against bitrate, but camera placements (and trajectories) also have a significant impact; and (3) overall subjective quality has strong linear and rank correlation with subjective similarity, sharpness, and color. These findings shed light on future research problems in the development of emerging applications that rely on immersive interactions.
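The paper's dataset-generation tools are not reproduced here, but as a rough illustration of the first step, a minimal sketch of grabbing per-camera RGB frames through AirSim's public Python API might look as follows; the camera names and output paths are assumptions, not the authors' setup.

```python
# Minimal sketch: capture PNG frames from two AirSim cameras.
# Assumes a running AirSim environment and the `airsim` pip package;
# the camera names ("front_center", "front_left") are assumed defaults.
import airsim

client = airsim.VehicleClient()
client.confirmConnection()

# Request PNG-compressed RGB frames from two source cameras.
responses = client.simGetImages([
    airsim.ImageRequest("front_center", airsim.ImageType.Scene),
    airsim.ImageRequest("front_left", airsim.ImageType.Scene),
])

for i, resp in enumerate(responses):
    # image_data_uint8 holds the compressed PNG bytes.
    airsim.write_file(f"camera_{i}.png", resp.image_data_uint8)
```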
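The QP-versus-bitrate observation can likewise be illustrated with a generic encoder sweep. The sketch below uses ffmpeg with libx265 as a stand-in; the paper's actual codec configuration inside TMIV is not shown here, and the input file name is a placeholder.

```python
# Hypothetical QP sweep with ffmpeg/libx265 as a stand-in encoder;
# "input.mp4" is a placeholder, not the paper's test material.
import os
import subprocess

for qp in (22, 27, 32, 37):
    out = f"encoded_qp{qp}.mp4"
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4",
         "-c:v", "libx265", "-x265-params", f"qp={qp}", out],
        check=True,
    )
    # File size serves as a crude bitrate proxy for a fixed-length clip.
    print(qp, os.path.getsize(out), "bytes")
```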
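Finally, the reported linear/rank correlations correspond to the standard Pearson (PLCC) and Spearman (SRCC) coefficients. A minimal sketch with fabricated placeholder ratings (not the paper's data) follows.

```python
# Sketch of the correlation analysis: Pearson (linear) and Spearman
# (rank) correlation between overall subjective quality and one
# per-dimension score. The arrays are fabricated placeholders.
from scipy.stats import pearsonr, spearmanr

overall_mos   = [4.1, 3.6, 2.8, 4.4, 3.2, 2.5]  # overall quality ratings
sharpness_mos = [4.0, 3.4, 3.0, 4.5, 3.1, 2.4]  # sharpness ratings

plcc, _ = pearsonr(overall_mos, sharpness_mos)   # linear correlation
srcc, _ = spearmanr(overall_mos, sharpness_mos)  # rank correlation
print(f"PLCC={plcc:.3f}, SRCC={srcc:.3f}")
```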