Targetless Extrinsic Calibration Between Event-Based and RGB Camera for Intelligent Transportation Systems
Christian Creß, Erik Schütz, B. L. Žagar, Alois Knoll
2023 IEEE Intelligent Vehicles Symposium (IV), June 4, 2023
DOI: 10.1109/IV55152.2023.10186538
The perception of Intelligent Transportation Systems (ITS) is mainly based on conventional cameras. Event-based cameras have high potential to increase detection performance in such sensor systems, but using both requires an extrinsic calibration between the two sensor types. Since a target-based method with a checkerboard on the highway is impractical, a targetless approach is necessary. To the best of our knowledge, no working approach for targetless extrinsic calibration between event-based and conventional cameras exists in the ITS domain. To fill this gap, we provide a targetless approach for extrinsic calibration. Our algorithm finds correspondences of the detected motion between both sensors using deep-learning-based instance segmentation and sparse optical flow, and then calculates the transformation between the cameras. We verified the effectiveness of our method in experiments, and its accuracy is comparable to that of existing multi-camera calibration methods. Our approach can therefore be used for targetless extrinsic calibration between event-based and conventional cameras.
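To make the pipeline described in the abstract concrete, below is a minimal illustrative sketch, not the authors' implementation. It assumes that instance-segmentation masks for moving objects are available from some detector, that the event stream has already been accumulated into a grayscale frame, and that the relation between the two image planes can be approximated by a RANSAC-estimated homography; the segmentation model, frame accumulation, and homography model are all assumptions for illustration. Sparse motion correspondences are tracked with pyramidal Lucas-Kanade optical flow in OpenCV.

```python
# Illustrative sketch of a targetless calibration pipeline:
# motion correspondences from segmented objects + sparse optical flow,
# then a robust transformation estimate between the two cameras.
import cv2
import numpy as np


def motion_correspondences(prev_gray, curr_gray, mask):
    """Track sparse features inside segmented (moving) regions with
    pyramidal Lucas-Kanade optical flow; return matched point pairs."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300,
                                  qualityLevel=0.01, minDistance=7,
                                  mask=mask)
    if pts is None:
        return np.empty((0, 2)), np.empty((0, 2))
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    ok = status.ravel() == 1
    return pts[ok].reshape(-1, 2), nxt[ok].reshape(-1, 2)


def estimate_transformation(rgb_points, event_points):
    """Estimate a robust projective transformation (homography) between
    corresponding motion points seen by the RGB and the event camera."""
    H, inliers = cv2.findHomography(rgb_points, event_points,
                                    method=cv2.RANSAC,
                                    ransacReprojThreshold=3.0)
    return H, inliers
```

In practice, correspondences would be aggregated over many frames and matched between the two sensors before the transformation is estimated; the sketch only shows the two building blocks the abstract names (sparse optical flow on segmented motion, followed by a transformation fit).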