{"title":"相对于地面真值位置和方向的二维相机阵列多相机标定算法的评估","authors":"Elijs Dima, Mårten Sjöström, R. Olsson","doi":"10.1109/3DTV.2016.7548887","DOIUrl":null,"url":null,"abstract":"Camera calibration methods are commonly evaluated on cumulative reprojection error metrics, on disparate one-dimensional datasets. To evaluate calibration of cameras in two-dimensional ar-rays, assessments need to be made on two-dimensional datasets with constraints on camera parameters. In this study, accuracy of several multi-camera calibration methods has been evaluated on camera parameters that are affecting view projection the most. As input data, we used a 15-viewpoint two-dimensional dataset with intrinsic and extrinsic parameter constraints and extrinsic ground truth. The assessment showed that self-calibration methods using structure-from-motion reach equal intrinsic and extrinsic parameter estimation accuracy with standard checkerboard calibration algorithm, and surpass a well-known self-calibration toolbox, BlueCCal. These results show that self-calibration is a viable approach to calibrating two-dimensional camera arrays, but improvements to state-of-art multi-camera feature matching are necessary to make BlueCCal as accurate as other self-calibration methods for two-dimensional camera arrays.","PeriodicalId":378956,"journal":{"name":"2016 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video (3DTV-CON)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Assessment of multi-camera calibration algorithms for two-dimensional camera arrays relative to ground truth position and direction\",\"authors\":\"Elijs Dima, Mårten Sjöström, R. Olsson\",\"doi\":\"10.1109/3DTV.2016.7548887\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Camera calibration methods are commonly evaluated on cumulative reprojection error metrics, on disparate one-dimensional datasets. To evaluate calibration of cameras in two-dimensional ar-rays, assessments need to be made on two-dimensional datasets with constraints on camera parameters. In this study, accuracy of several multi-camera calibration methods has been evaluated on camera parameters that are affecting view projection the most. As input data, we used a 15-viewpoint two-dimensional dataset with intrinsic and extrinsic parameter constraints and extrinsic ground truth. The assessment showed that self-calibration methods using structure-from-motion reach equal intrinsic and extrinsic parameter estimation accuracy with standard checkerboard calibration algorithm, and surpass a well-known self-calibration toolbox, BlueCCal. 
These results show that self-calibration is a viable approach to calibrating two-dimensional camera arrays, but improvements to state-of-art multi-camera feature matching are necessary to make BlueCCal as accurate as other self-calibration methods for two-dimensional camera arrays.\",\"PeriodicalId\":378956,\"journal\":{\"name\":\"2016 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video (3DTV-CON)\",\"volume\":\"16 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-07-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video (3DTV-CON)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/3DTV.2016.7548887\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video (3DTV-CON)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/3DTV.2016.7548887","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Camera calibration methods are commonly evaluated with cumulative reprojection error metrics on disparate one-dimensional datasets. To evaluate the calibration of cameras in two-dimensional arrays, assessments need to be made on two-dimensional datasets with constraints on camera parameters. In this study, the accuracy of several multi-camera calibration methods has been evaluated on the camera parameters that affect view projection the most. As input data, we used a 15-viewpoint two-dimensional dataset with intrinsic and extrinsic parameter constraints and extrinsic ground truth. The assessment showed that self-calibration methods using structure-from-motion reach intrinsic and extrinsic parameter estimation accuracy equal to that of a standard checkerboard calibration algorithm, and surpass a well-known self-calibration toolbox, BlueCCal. These results show that self-calibration is a viable approach to calibrating two-dimensional camera arrays, but improvements to state-of-the-art multi-camera feature matching are necessary to make BlueCCal as accurate as other self-calibration methods for two-dimensional camera arrays.
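The extrinsic comparison described in the abstract amounts to measuring how far each estimated camera centre and viewing direction lies from its ground-truth counterpart. The sketch below is only an illustration of such a comparison (not the authors' evaluation code), assuming extrinsics are given as the usual world-to-camera rotation R and translation t; all numeric values are hypothetical placeholders.

```python
# Minimal sketch: score estimated camera extrinsics against ground-truth
# position and viewing direction. Pose values below are hypothetical.
import numpy as np

def camera_center_and_axis(R, t):
    """World-space camera centre and optical axis from extrinsics x_cam = R @ x_world + t."""
    center = -R.T @ t                             # camera position in world coordinates
    axis = R.T @ np.array([0.0, 0.0, 1.0])        # viewing direction (camera z-axis) in world coords
    return center, axis / np.linalg.norm(axis)

def pose_errors(R_est, t_est, R_gt, t_gt):
    """Return (position error, direction error in degrees) relative to ground truth."""
    c_est, a_est = camera_center_and_axis(R_est, t_est)
    c_gt, a_gt = camera_center_and_axis(R_gt, t_gt)
    pos_err = np.linalg.norm(c_est - c_gt)
    ang_err = np.degrees(np.arccos(np.clip(np.dot(a_est, a_gt), -1.0, 1.0)))
    return pos_err, ang_err

# Hypothetical example: ground truth is the identity pose, the estimate is slightly off.
R_gt, t_gt = np.eye(3), np.zeros(3)
theta = np.radians(1.0)                           # 1 degree rotation error about the y-axis
R_est = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                  [0.0, 1.0, 0.0],
                  [-np.sin(theta), 0.0, np.cos(theta)]])
t_est = np.array([0.002, 0.0, 0.001])             # a few millimetres of translation error

print(pose_errors(R_est, t_est, R_gt, t_gt))      # -> (~0.0022, ~1.0)
```

Aggregating these per-camera position and angle errors across all 15 viewpoints gives the kind of ground-truth-relative accuracy figures the paper compares, as opposed to a purely image-space reprojection error.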