Mohammed Salah, Abdulla Ayyad, Muhammad Humais, Daniel Gehrig, Abdelqader Abusafieh, Lakmal Seneviratne, Davide Scaramuzza, Yahya Zweiri
E-Calib: A Fast, Robust, and Accurate Calibration Toolbox for Event Cameras
DOI: 10.1109/TIP.2024.3410673
Journal: IEEE Transactions on Image Processing
Published: 2024-06-13 (Journal Article)
URL: https://ieeexplore.ieee.org/document/10555516/
PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10555516
Citations: 0
Abstract
Event cameras have triggered a paradigm shift in the computer vision community, characterized by their asynchronous nature, low latency, and high dynamic range. Calibration of event cameras is essential to account for the sensor's intrinsic parameters and to enable 3D perception. However, conventional image-based calibration techniques are not applicable due to the asynchronous, binary output of the sensor. The current standard for calibrating event cameras relies on either blinking patterns or event-based image reconstruction algorithms. These approaches are difficult to deploy in factory settings and are affected by noise and artifacts that degrade calibration performance. To address these limitations, we present E-Calib, a novel, fast, robust, and accurate calibration toolbox for event cameras utilizing the asymmetric circle grid, chosen for its robustness to out-of-focus scenes. E-Calib introduces an efficient reweighted least squares (eRWLS) method for feature extraction of the calibration pattern circles with sub-pixel accuracy and robustness to noise. In addition, a modified hierarchical clustering algorithm is devised to separate the calibration grid from background clutter. The proposed method is tested in a variety of rigorous experiments with different event camera models, on circle grids with different geometric properties, over varying calibration trajectories and speeds, and under challenging illumination conditions. The results show that our approach outperforms the state of the art in detection success rate, reprojection error, and pose estimation accuracy.
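The abstract does not detail the eRWLS feature extractor, but the general idea of fitting circle centers robustly via reweighted least squares can be illustrated with a minimal sketch. The following is an assumption-laden illustration, not the paper's actual algorithm: it fits an algebraic circle model to 2D points (e.g., event coordinates belonging to one pattern circle) and iteratively downweights outliers with a Huber-style weight; the function name and weighting scheme are invented for this example.

```python
import numpy as np

def irls_circle_fit(points, n_iters=10, eps=1e-9):
    """Illustrative iteratively reweighted least-squares circle fit.

    Uses the algebraic model x^2 + y^2 + D*x + E*y + F = 0, solved as a
    weighted linear least-squares problem, with Huber-style weights that
    suppress outlier points on each iteration.
    Returns (center_x, center_y, radius).
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    w = np.ones(len(pts))

    for _ in range(n_iters):
        # Solve the weighted linear system for the algebraic parameters.
        Aw = A * w[:, None]
        D, E, F = np.linalg.lstsq(Aw, w * b, rcond=None)[0]
        cx, cy = -D / 2.0, -E / 2.0
        r = np.sqrt(max(cx**2 + cy**2 - F, 0.0))
        # Geometric residuals (distance from the fitted circle) drive
        # the reweighting: points far off the circle get small weights.
        res = np.abs(np.hypot(x - cx, y - cy) - r)
        scale = np.median(res) + eps
        w = np.minimum(1.0, scale / (res + eps))

    return cx, cy, r
```

A sub-pixel center estimate falls out of the least-squares solution directly, since the fitted parameters are continuous even though the input coordinates are integer pixel locations. The reweighting step is what lends robustness to the noise events the abstract mentions.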