{"title":"Designing an Efficient Emergency Response Airborne Mapping System with Multiple Sensors","authors":"Chaoyong Shen, Zongjian Lin, Shaoqi Zhou, Xuling Luo, Yu Zhang","doi":"10.1155/2021/3228291","DOIUrl":null,"url":null,"abstract":"Multisource remote sensing data have been extensively used in disaster and emergency response management. Different types of visual and measured data, such as high-resolution orthoimages, real-time videos, accurate digital elevation models, and three-dimensional landscape maps, can enable producing effective rescue plans and aid the efficient dispatching of rescuers after disasters. Generally, such data are acquired using unmanned aerial vehicles equipped with multiple sensors. For typical application scenarios, efficient and real-time access to data is more important in emergency response cases than in traditional application scenarios. In this study, an efficient emergency response airborne mapping system equipped with multiple sensors was designed. The system comprises groups of wide-angle cameras, a high-definition video camera, an infrared video camera, a LiDAR system, and a global navigation satellite system/inertial measurement unit. The wide-angle cameras had a visual field of 85° × 105°, facilitating the efficient operation of the mapping system. Numerous calibrations were performed on the constructed mapping system. In particular, initial calibration and self-calibration were performed to determine the relative pose between different wide-angle cameras to fuse all the acquired images. The mapping system was then tested in an area with altitudes of 1000 m–1250 m. The biases of the wide-angle cameras were small bias values (0.090 m, −0.018 m, and −0.046 m in the x-, y-, and z-axes, respectively). Moreover, the root-mean-square error (RMSE) along the planer direction was smaller than that along the vertical direction (0.202 and 0.294 m, respectively). The LiDAR system achieved smaller biases (0.117, −0.020, and −0.039 m in the x-, y-, and z-axes, respectively) and a smaller RMSE in the vertical direction (0.192 m) than the wide-angle cameras; however, RMSE of the LiDAR system along the planar direction (0.276 m) was slightly larger. The proposed system shows potential for use in emergency response systems for efficiently acquiring data such as images and point clouds.","PeriodicalId":55995,"journal":{"name":"International Journal of Optics","volume":" ","pages":""},"PeriodicalIF":1.8000,"publicationDate":"2021-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Optics","FirstCategoryId":"101","ListUrlMain":"https://doi.org/10.1155/2021/3228291","RegionNum":4,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"OPTICS","Score":null,"Total":0}
Citations: 0
Abstract
Multisource remote sensing data have been extensively used in disaster and emergency response management. Different types of visual and measured data, such as high-resolution orthoimages, real-time videos, accurate digital elevation models, and three-dimensional landscape maps, enable the production of effective rescue plans and aid the efficient dispatch of rescuers after disasters. Such data are generally acquired using unmanned aerial vehicles equipped with multiple sensors, and efficient, real-time access to the data matters more in emergency response than in traditional application scenarios. In this study, an efficient emergency response airborne mapping system equipped with multiple sensors was designed. The system comprises groups of wide-angle cameras, a high-definition video camera, an infrared video camera, a LiDAR system, and a global navigation satellite system/inertial measurement unit. The wide-angle cameras provide a field of view of 85° × 105°, which facilitates efficient operation of the mapping system. Several calibrations were performed on the constructed system; in particular, initial calibration and self-calibration were carried out to determine the relative poses between the wide-angle cameras so that all acquired images could be fused. The mapping system was then tested in an area with altitudes of 1000 m–1250 m. The wide-angle cameras exhibited small biases (0.090 m, −0.018 m, and −0.046 m in the x-, y-, and z-axes, respectively), and their root-mean-square error (RMSE) in the planar direction was smaller than that in the vertical direction (0.202 and 0.294 m, respectively). The LiDAR system achieved smaller biases (0.117, −0.020, and −0.039 m in the x-, y-, and z-axes, respectively) and a smaller RMSE in the vertical direction (0.192 m) than the wide-angle cameras; however, the RMSE of the LiDAR system in the planar direction (0.276 m) was slightly larger. The proposed system shows potential for use in emergency response systems for efficiently acquiring data such as images and point clouds.
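The abstract reports per-axis biases and planar/vertical RMSE values for both the camera block and the LiDAR point cloud but does not spell out the evaluation procedure. The following is a minimal sketch, assuming the common convention that the bias is the mean signed residual per axis at ground check points, the planar RMSE combines the x and y residuals, and the vertical RMSE uses z alone; the function name accuracy_stats, the sign convention (measured minus reference), and the sample coordinates are illustrative and not taken from the paper.

import numpy as np

def accuracy_stats(measured, reference):
    """Per-axis bias plus planar and vertical RMSE at check points.

    measured, reference: (N, 3) arrays of x, y, z coordinates in metres.
    Residuals are taken as measured minus reference (an assumed sign
    convention; the abstract does not state one).
    """
    residuals = np.asarray(measured, dtype=float) - np.asarray(reference, dtype=float)
    bias = residuals.mean(axis=0)  # mean signed error per axis (x, y, z)
    # Planar RMSE combines x and y residuals; it equals sqrt(RMSE_x^2 + RMSE_y^2).
    planar_rmse = np.sqrt(np.mean(residuals[:, 0] ** 2 + residuals[:, 1] ** 2))
    vertical_rmse = np.sqrt(np.mean(residuals[:, 2] ** 2))
    return bias, planar_rmse, vertical_rmse

# Hypothetical check-point coordinates, only to show the call pattern.
measured = np.array([[10.09, 20.02, 99.95],
                     [30.15, 40.01, 100.04],
                     [50.05, 59.96, 100.02]])
reference = np.array([[10.00, 20.00, 100.00],
                      [30.00, 40.00, 100.00],
                      [50.00, 60.00, 100.00]])
bias, planar, vertical = accuracy_stats(measured, reference)
print("bias (x, y, z):", bias, "planar RMSE:", planar, "vertical RMSE:", vertical)

Under these assumptions, a single planar figure such as the reported 0.202 m (cameras) or 0.276 m (LiDAR) summarises the horizontal error of all check points, while the vertical figure isolates the height error.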
About the journal:
International Journal of Optics publishes papers on the nature of light, its properties and behaviours, and its interaction with matter. The journal considers both fundamental and highly applied studies, especially those that promise technological solutions for the next generation of systems and devices. As well as original research, International Journal of Optics also publishes focused review articles that examine the state of the art, identify emerging trends, and suggest future directions for developing fields.