Uncertainty Estimation for Efficient Monocular Depth Perception
Hao Du, Guoan Cheng, Ai Matsune, Qiang Zhu, Shu Zhan
2022 Asia Conference on Algorithms, Computing and Machine Learning (CACML), March 2022
DOI: 10.1109/CACML55074.2022.00138
Citations: 0
Abstract
In monocular depth perception, the ground-truth depth maps often contain erroneous values, and network performance suffers when such data are used for training. To this end, a modified uncertainty loss for monocular depth estimation is proposed to alleviate this issue. The epistemic uncertainty is calculated in logarithm space, while the aleatoric uncertainty is left unchanged. The experimental results demonstrate that our method outperforms the previous state of the art, yielding the highest performance on the NYU-Depth-v2 dataset across all metrics. In addition, the uncertainty maps help qualitatively evaluate the estimation quality of each region.
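The abstract does not give the exact form of the modified loss. As a point of reference, the standard aleatoric-uncertainty regression loss that work of this kind typically builds on (a heteroscedastic Gaussian negative log-likelihood, with the network predicting a per-pixel log-variance for numerical stability) can be sketched as follows; the function name and parameters here are illustrative, not taken from the paper:

```python
import math

def aleatoric_loss(pred: float, target: float, log_var: float) -> float:
    """Heteroscedastic regression loss for one pixel (illustrative sketch).

    L = 0.5 * exp(-s) * (y_hat - y)^2 + 0.5 * s,  where s = log(sigma^2).

    Predicting s instead of sigma^2 keeps the loss stable: large errors are
    down-weighted by high predicted variance, while the 0.5 * s term
    penalizes the network for claiming high uncertainty everywhere.
    """
    residual_sq = (pred - target) ** 2
    return 0.5 * math.exp(-log_var) * residual_sq + 0.5 * log_var
```

With a perfect prediction and unit variance (log_var = 0) the loss is zero; raising the predicted log-variance trades a smaller residual term for a larger uncertainty penalty, which is what lets the network discount unreliable ground-truth pixels during training.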