{"title":"模拟自动驾驶系统中的三维环境感知建模","authors":"Chunmian Lin;Daxin Tian;Xuting Duan;Jianshan Zhou","doi":"10.23919/CSMS.2021.0004","DOIUrl":null,"url":null,"abstract":"Self-driving vehicles require a number of tests to prevent fatal accidents and ensure their appropriate operation in the physical world. However, conducting vehicle tests on the road is difficult because such tests are expensive and labor intensive. In this study, we used an autonomous-driving simulator, and investigated the three-dimensional environmental perception problem of the simulated system. Using the open-source CARLA simulator, we generated a CarlaSim from unreal traffic scenarios, comprising 15000 camera-LiDAR (Light Detection and Ranging) samples with annotations and calibration files. Then, we developed Multi-Sensor Fusion Perception (MSFP) model for consuming two-modal data and detecting objects in the scenes. Furthermore, we conducted experiments on the KITTI and CarlaSim datasets; the results demonstrated the effectiveness of our proposed methods in terms of perception accuracy, inference efficiency, and generalization performance. The results of this study will faciliate the future development of autonomous-driving simulated tests.","PeriodicalId":65786,"journal":{"name":"复杂系统建模与仿真(英文)","volume":"1 1","pages":"45-54"},"PeriodicalIF":0.0000,"publicationDate":"2021-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.23919/CSMS.2021.0004","citationCount":"5","resultStr":"{\"title\":\"3D Environmental Perception Modeling in the Simulated Autonomous-Driving Systems\",\"authors\":\"Chunmian Lin;Daxin Tian;Xuting Duan;Jianshan Zhou\",\"doi\":\"10.23919/CSMS.2021.0004\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Self-driving vehicles require a number of tests to prevent fatal accidents and ensure their appropriate operation in the physical world. 
However, conducting vehicle tests on the road is difficult because such tests are expensive and labor intensive. In this study, we used an autonomous-driving simulator, and investigated the three-dimensional environmental perception problem of the simulated system. Using the open-source CARLA simulator, we generated a CarlaSim from unreal traffic scenarios, comprising 15000 camera-LiDAR (Light Detection and Ranging) samples with annotations and calibration files. Then, we developed Multi-Sensor Fusion Perception (MSFP) model for consuming two-modal data and detecting objects in the scenes. Furthermore, we conducted experiments on the KITTI and CarlaSim datasets; the results demonstrated the effectiveness of our proposed methods in terms of perception accuracy, inference efficiency, and generalization performance. The results of this study will faciliate the future development of autonomous-driving simulated tests.\",\"PeriodicalId\":65786,\"journal\":{\"name\":\"复杂系统建模与仿真(英文)\",\"volume\":\"1 1\",\"pages\":\"45-54\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-03-01\",\"publicationTypes\":\"Journal 
Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://sci-hub-pdf.com/10.23919/CSMS.2021.0004\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"复杂系统建模与仿真(英文)\",\"FirstCategoryId\":\"1089\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/9426465/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"复杂系统建模与仿真(英文)","FirstCategoryId":"1089","ListUrlMain":"https://ieeexplore.ieee.org/document/9426465/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
3D Environmental Perception Modeling in the Simulated Autonomous-Driving Systems
Self-driving vehicles require a number of tests to prevent fatal accidents and ensure their appropriate operation in the physical world. However, conducting vehicle tests on the road is difficult because such tests are expensive and labor intensive. In this study, we used an autonomous-driving simulator and investigated the three-dimensional environmental perception problem of the simulated system. Using the open-source CARLA simulator, we generated a dataset, CarlaSim, from virtual traffic scenarios, comprising 15,000 camera-LiDAR (Light Detection and Ranging) samples with annotations and calibration files. Then, we developed a Multi-Sensor Fusion Perception (MSFP) model that consumes the two modalities and detects objects in the scenes. Furthermore, we conducted experiments on the KITTI and CarlaSim datasets; the results demonstrated the effectiveness of our proposed methods in terms of perception accuracy, inference efficiency, and generalization performance. The results of this study will facilitate the future development of simulated autonomous-driving tests.
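The abstract does not detail how the MSFP model fuses the two modalities, but camera-LiDAR pipelines on KITTI-style data typically begin by projecting LiDAR points into the image plane using the per-sample calibration files the dataset ships with. The sketch below is a minimal, hypothetical illustration of that preprocessing step, not the paper's actual method; the function name and the assumption of a 3x4 camera projection matrix `P` and a 4x4 LiDAR-to-camera transform `Tr` (the KITTI convention) are ours.

```python
import numpy as np

def project_lidar_to_image(points, P, Tr):
    """Project LiDAR points (N, 3) into pixel coordinates.

    P  : 3x4 camera projection matrix (from the calibration file).
    Tr : 4x4 rigid transform from the LiDAR frame to the camera frame.
    Returns the (M, 2) pixel coordinates of the points in front of the
    camera, plus a length-N boolean mask marking which points were kept.
    """
    n = points.shape[0]
    pts_h = np.hstack([points, np.ones((n, 1))])   # homogeneous (N, 4)
    cam = (Tr @ pts_h.T).T                         # LiDAR frame -> camera frame
    in_front = cam[:, 2] > 0                       # drop points behind the camera
    img = (P @ cam[in_front].T).T                  # (M, 3) homogeneous pixels
    img = img[:, :2] / img[:, 2:3]                 # perspective divide -> (u, v)
    return img, in_front
```

With real calibration matrices, the resulting pixel coordinates let each LiDAR point be associated with image features, which is the usual entry point for two-modal fusion.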