TUMTraf Event: Calibration and Fusion Resulting in a Dataset for Roadside Event-Based and RGB Cameras

IF 14.0 · JCR Q1 (Computer Science, Artificial Intelligence) · CAS Region 1 (Engineering & Technology)
Christian Creß;Walter Zimmer;Nils Purschke;Bach Ngoc Doan;Sven Kirchner;Venkatnarayanan Lakshminarasimhan;Leah Strand;Alois C. Knoll
DOI: 10.1109/TIV.2024.3393749
Journal: IEEE Transactions on Intelligent Vehicles, vol. 9, no. 7, pp. 5186-5203
Publication date: 2024-04-25
Open-access PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10508494
Citations: 0

Abstract

Event-based cameras are predestined for Intelligent Transportation Systems (ITS). They provide very high temporal resolution and dynamic range, which can eliminate motion blur and improve detection performance at night. However, event-based images lack color and texture compared to images from a conventional RGB camera. Considering that, data fusion between event-based and conventional cameras can combine the strengths of both modalities. For this purpose, extrinsic calibration is necessary. To the best of our knowledge, no targetless calibration between event-based and RGB cameras can handle multiple moving objects, nor does data fusion optimized for the domain of roadside ITS exist. Furthermore, synchronized event-based and RGB camera datasets considering roadside perspective are not yet published. To fill these research gaps, based on our previous work, we extended our targetless calibration approach with clustering methods to handle multiple moving objects. Furthermore, we developed an Early Fusion, Simple Late Fusion, and a novel Spatiotemporal Late Fusion method. Lastly, we published the TUMTraf Event Dataset, which contains more than 4,111 synchronized event-based and RGB images with 50,496 labeled 2D boxes. During our extensive experiments, we verified the effectiveness of our calibration method with multiple moving objects. Furthermore, compared to a single RGB camera, we increased the detection performance of up to +9% mAP in the day and up to +13% mAP during the challenging night with our presented event-based sensor fusion methods.
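The abstract names a clustering step that extends the targetless calibration to multiple moving objects, but gives no algorithmic detail. As a purely illustrative sketch (not the paper's actual method), assuming each moving object shows up as a dense group of event coordinates, a toy single-linkage clustering (a stand-in for methods such as DBSCAN) could group events and yield per-object centroids; all names here are hypothetical:

```python
def cluster_points(points, eps=5.0):
    """Toy single-linkage clustering: points closer than eps end up in
    one cluster (a stand-in for e.g. DBSCAN on event coordinates).
    Uses union-find with path halving; O(n^2) distance checks."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            dx = points[i][0] - points[j][0]
            dy = points[i][1] - points[j][1]
            if dx * dx + dy * dy <= eps * eps:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[rj] = ri

    clusters = {}
    for i, p in enumerate(points):
        clusters.setdefault(find(i), []).append(p)
    return list(clusters.values())

def centroids(clusters):
    """Per-cluster mean position, usable as a correspondence point."""
    return [(sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
            for c in clusters]
```

Matching such per-object centroids between the event-based and the RGB view would then provide point correspondences from which an extrinsic transformation can be estimated.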
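Likewise, the Early, Simple Late, and Spatiotemporal Late Fusion methods are not specified in this abstract. As one hedged illustration of what a simple late fusion of 2D detections could look like, the boxes from both detectors can be pooled and de-duplicated by greedy IoU suppression; the function names are hypothetical:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    if inter == 0.0:
        return 0.0
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def simple_late_fusion(rgb_dets, event_dets, iou_thr=0.5):
    """Merge detections from two detectors. Each detection is (box, score).
    Greedy NMS over the pooled, score-sorted list keeps the higher-scoring
    member of each overlapping pair."""
    pooled = sorted(rgb_dets + event_dets, key=lambda d: d[1], reverse=True)
    kept = []
    for box, score in pooled:
        if all(iou(box, k[0]) < iou_thr for k in kept):
            kept.append((box, score))
    return kept
```

A spatiotemporal variant would additionally weight or gate the match by how close the two detections are in time, which this sketch omits.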
Source journal
IEEE Transactions on Intelligent Vehicles (category: Mathematics - Control and Optimization)
CiteScore: 12.10
Self-citation rate: 13.40%
Annual article count: 177
Journal description: The IEEE Transactions on Intelligent Vehicles (T-IV) is a premier platform for publishing peer-reviewed articles that present innovative research concepts, application results, significant theoretical findings, and application case studies in the field of intelligent vehicles. With a particular emphasis on automated vehicles within roadway environments, T-IV aims to raise awareness of pressing research and application challenges. Our focus is on providing critical information to the intelligent vehicle community, serving as a dissemination vehicle for IEEE ITS Society members and others interested in learning about the state-of-the-art developments and progress in research and applications related to intelligent vehicles. Join us in advancing knowledge and innovation in this dynamic field.