A Multimodal Off-Road Terrain Classification Benchmark for Extraterrestrial Traversability Analysis
Huang Huang, Yi Yang, Liang Tang, Zhang Zhang, Nailong Liu, Mou Li, Liang Wang
2022 International Conference on Cyber-Enabled Distributed Computing and Knowledge Discovery (CyberC), October 2022
DOI: 10.1109/CyberC55534.2022.00028
Citations: 0
Abstract
A rover in extraterrestrial exploration operates in a challenging environment characterized by primitive landforms and hidden hazardous areas. Because of the large distance between the rover and Earth, the ability to recognize and model terrain properties efficiently and autonomously is one of the rover's most crucial capabilities. In this paper, we present the Multimodal Off-road Terrain Classification (MOTC) dataset, collected by a four-wheeled rover equipped with ego-centric visual cameras and an inertial measurement unit (IMU). The dataset was generated on a boulder-strewn mock-up of the Martian surface at the Intelligent Autonomous System Laboratory of the Beijing Institute of Control Engineering. 24,982 images and the corresponding sensor sequences were collected and annotated with three kinds of surface material and three kinds of scene geometry. Based on the MOTC dataset, a baseline model with a multimodal fusion architecture is proposed for terrain classification. Experiments show that the features extracted from the visual images and from the IMU complement each other, improving terrain classification accuracy on the challenging extraterrestrial surface.
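To make the fusion idea concrete, below is a minimal sketch of a multimodal terrain classifier in PyTorch. The abstract does not specify the authors' architecture, so everything here is an assumption: the ResNet-18 visual backbone, the 6-channel IMU input with a 100-sample window, late fusion by feature concatenation, and the two 3-way heads (surface material and scene geometry, matching the dataset's annotation scheme) are illustrative choices, not the paper's baseline model.

```python
# Minimal late-fusion sketch, NOT the authors' published model: backbone,
# layer sizes, and IMU window length are illustrative assumptions.
import torch
import torch.nn as nn
import torchvision.models as models

class MultimodalTerrainNet(nn.Module):
    """Fuses ego-centric image features with IMU sequence features and
    predicts two labels: surface material and scene geometry (3 classes each)."""

    def __init__(self, imu_channels: int = 6, imu_window: int = 100):
        super().__init__()
        # Visual branch: ImageNet-pretrained ResNet-18 with the classifier
        # head removed, yielding a 512-d image embedding.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.visual = nn.Sequential(*list(backbone.children())[:-1])
        # IMU branch: 1-D convolutions over a (channels x time) window of
        # accelerometer/gyroscope readings, pooled to a 128-d embedding.
        self.imu = nn.Sequential(
            nn.Conv1d(imu_channels, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Two classification heads over the concatenated embedding.
        self.material_head = nn.Linear(512 + 128, 3)  # 3 surface materials
        self.geometry_head = nn.Linear(512 + 128, 3)  # 3 scene geometries

    def forward(self, image: torch.Tensor, imu: torch.Tensor):
        # image: (B, 3, 224, 224); imu: (B, imu_channels, imu_window)
        v = self.visual(image).flatten(1)    # (B, 512)
        s = self.imu(imu).flatten(1)         # (B, 128)
        fused = torch.cat([v, s], dim=1)     # late fusion by concatenation
        return self.material_head(fused), self.geometry_head(fused)

# Smoke test with random tensors standing in for MOTC samples.
model = MultimodalTerrainNet()
mat_logits, geo_logits = model(torch.randn(2, 3, 224, 224),
                               torch.randn(2, 6, 100))
print(mat_logits.shape, geo_logits.shape)  # torch.Size([2, 3]) torch.Size([2, 3])
```

Late fusion by concatenation is only the simplest way to combine the two modalities; it lets each branch specialize (appearance cues from images, vibration signatures from the IMU) before a shared classifier, which is consistent with the abstract's finding that the two feature sources complement each other.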