{"title":"视野合成的距离引导深度细化和不确定性感知聚合","authors":"Yuan Chang, Yisong Chen, Guoping Wang","doi":"10.1109/ICASSP39728.2021.9413981","DOIUrl":null,"url":null,"abstract":"In this paper, we present a framework of view synthesis, including range guided depth refinement and uncertainty-aware aggregation based novel view synthesis. We first propose a novel depth refinement method to improve the quality and robustness of the depth map reconstruction. To that end, we use a range prior to constrain the estimated depth, which helps us to get more accurate depth information. Then we propose an uncertainty-aware aggregation method for novel view synthesis. We compute the uncertainty of the estimated depth for each pixel, and reduce the influence of pixels whose uncertainty are large when synthesizing novel views. This step helps to reduce some artifacts such as ghost and blur. We validate the performance of our algorithm experimentally, and we show that our approach achieves state-of-the-art performance.","PeriodicalId":347060,"journal":{"name":"ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Range Guided Depth Refinement and Uncertainty-Aware Aggregation for View Synthesis\",\"authors\":\"Yuan Chang, Yisong Chen, Guoping Wang\",\"doi\":\"10.1109/ICASSP39728.2021.9413981\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, we present a framework of view synthesis, including range guided depth refinement and uncertainty-aware aggregation based novel view synthesis. We first propose a novel depth refinement method to improve the quality and robustness of the depth map reconstruction. To that end, we use a range prior to constrain the estimated depth, which helps us to get more accurate depth information. Then we propose an uncertainty-aware aggregation method for novel view synthesis. We compute the uncertainty of the estimated depth for each pixel, and reduce the influence of pixels whose uncertainty are large when synthesizing novel views. This step helps to reduce some artifacts such as ghost and blur. 
We validate the performance of our algorithm experimentally, and we show that our approach achieves state-of-the-art performance.\",\"PeriodicalId\":347060,\"journal\":{\"name\":\"ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)\",\"volume\":\"39 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-06-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICASSP39728.2021.9413981\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICASSP39728.2021.9413981","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Range Guided Depth Refinement and Uncertainty-Aware Aggregation for View Synthesis
In this paper, we present a view synthesis framework consisting of range-guided depth refinement and uncertainty-aware aggregation for novel view synthesis. We first propose a novel depth refinement method to improve the quality and robustness of depth map reconstruction. To that end, we use a range prior to constrain the estimated depth, which yields more accurate depth information. We then propose an uncertainty-aware aggregation method for novel view synthesis: we compute the uncertainty of the estimated depth at each pixel and reduce the influence of pixels whose uncertainty is large when synthesizing novel views. This step helps to suppress artifacts such as ghosting and blur. We validate the performance of our algorithm experimentally and show that our approach achieves state-of-the-art performance.
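To make the aggregation idea concrete, below is a minimal sketch of uncertainty-weighted blending of warped source views. It is not the paper's exact formulation; the exponential weighting, the function name aggregate_views, and the array shapes are assumptions used purely for illustration.

```python
# Minimal sketch: uncertainty-aware aggregation of warped source views.
# The weighting scheme (exp(-uncertainty)) is an assumption, not the paper's method.
import numpy as np

def aggregate_views(warped_views, uncertainties, eps=1e-6):
    """Blend warped source views into a novel view, down-weighting
    pixels whose estimated depth uncertainty is large.

    warped_views:  (N, H, W, 3) source images warped to the target pose
    uncertainties: (N, H, W)    per-pixel depth uncertainty for each view
    """
    # Convert uncertainty to a confidence weight (assumed exponential decay).
    weights = np.exp(-uncertainties)                    # (N, H, W)
    weights = weights / (weights.sum(axis=0) + eps)     # normalize across views
    # Weighted blend of the warped views.
    return (weights[..., None] * warped_views).sum(axis=0)

# Example usage with random data: 2 source views of a 4x6 image.
views = np.random.rand(2, 4, 6, 3)
uncert = np.random.rand(2, 4, 6)
novel = aggregate_views(views, uncert)                  # (4, 6, 3)
```

Pixels with high depth uncertainty contribute little to the blend, which is the mechanism the abstract credits with reducing ghosting and blur in the synthesized view.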