Howard Zhang, Ash Liu, Saied Habibi, Martin v. Mohrenschildt, Ryan Ahmed
CMHT autonomous dataset: A multi-sensor dataset including radar and IR for autonomous driving
Data in Brief, Volume 60, Article 111552. Published 2025-04-11. DOI: 10.1016/j.dib.2025.111552
https://www.sciencedirect.com/science/article/pii/S2352340925002847
Citations: 0
Abstract
Standardized datasets are essential for the development and evaluation of autonomous driving algorithms. As the types of sensors available to researchers increase, datasets containing a variety of temporally and spatially aligned sensors have become increasingly valuable. This paper presents a driving dataset recorded with a complete sensor suite for research on autonomous driving, perception, and sensor fusion. The dataset consists of over 9,000 frames of data recorded at 10-20 Hz using a sensor suite made up of a Velodyne LiDAR, GPS/IMU, mm-wave radar, and color and infrared cameras. The capture scenarios include poor weather and lighting conditions, such as rain and nighttime driving, and diverse traffic conditions, such as highways and cities with various objects. Both fully synchronized data and raw recordings in the form of ROS 2 bags are provided, as well as 3D tracklet labels for individual objects. This paper provides technical details on the driving platform, data format, and utilities.
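The abstract does not specify how the fully synchronized frames are produced from the raw ROS 2 bag recordings; a common approach for sensors running at different rates (e.g., 10-20 Hz here) is nearest-timestamp matching within a tolerance. The sketch below illustrates that general technique only; the function names, the example rates, and the skew threshold are hypothetical and not taken from the paper.

```python
from bisect import bisect_left

def nearest_timestamp(sorted_ts, query):
    """Return the timestamp in sorted_ts closest to query (sorted_ts ascending)."""
    i = bisect_left(sorted_ts, query)
    if i == 0:
        return sorted_ts[0]
    if i == len(sorted_ts):
        return sorted_ts[-1]
    before, after = sorted_ts[i - 1], sorted_ts[i]
    return after if after - query < query - before else before

def align_frames(ref_ts, other_ts, max_skew=0.05):
    """Pair each reference-sensor timestamp with the nearest timestamp from
    another sensor, dropping pairs whose skew exceeds max_skew seconds."""
    pairs = []
    for t in ref_ts:
        m = nearest_timestamp(other_ts, t)
        if abs(m - t) <= max_skew:
            pairs.append((t, m))
    return pairs

# Illustrative rates: a 10 Hz LiDAR matched against a 20 Hz radar.
lidar = [0.0, 0.1, 0.2]
radar = [0.0, 0.05, 0.1, 0.15, 0.2]
print(align_frames(lidar, radar))  # [(0.0, 0.0), (0.1, 0.1), (0.2, 0.2)]
```

In practice a dataset pipeline would read the per-message header stamps out of the ROS 2 bags and apply matching like this per sensor pair, with the slowest sensor as the reference so no reference frame is dropped.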
About the journal:
Data in Brief provides a way for researchers to easily share and reuse each other's datasets by publishing data articles that:
- Thoroughly describe your data, facilitating reproducibility.
- Make your data, which is often buried in supplementary material, easier to find.
- Increase traffic towards associated research articles and data, leading to more citations.
- Open up doors for new collaborations.
Because you never know what data will be useful to someone else, Data in Brief welcomes submissions that describe data from all research areas.