Dynamical Alignment and Estimation for Horizontal Attitude of UAV Based on Vision and IMU
Xueyong Wu, Jie Li, Cheng Zhang, Yu Yang, Yachao Yang
Proceedings of the 2020 3rd International Conference on Algorithms, Computing and Artificial Intelligence, December 2020. DOI: 10.1145/3446132.3446161
Accurate attitude estimation of an unmanned aerial vehicle (UAV) is a critical requirement for autonomous, stable flight. Under dynamic conditions, the autopilot of a UAV cannot complete self-alignment and compute an accurate attitude by relying on an inertial measurement unit (IMU) alone. This paper presents a dynamical alignment and estimation method for the horizontal attitude (the pitch and roll Euler angles) based on vision and an IMU. First, the horizontal attitude is estimated from the image information of a visible-light camera (hereinafter referred to as the visual pose) and used as the initial alignment input for the IMU. Then the visual pose is corrected according to the normalized accelerometer output while the UAV is stationary. Finally, a Sage-Husa adaptive Kalman filter (SHAKF) fuses the visual pose with the inertial attitude (the attitude computed by the IMU). Simulation results show that the maximum estimation error of the UAV's horizontal attitude is within 3° and the mean absolute estimation error is less than 1°, which verifies the effectiveness of the method.
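To make the two fusion ingredients concrete, the sketch below (not taken from the paper) computes pitch and roll from a normalized accelerometer reading and runs a minimal Sage-Husa adaptive Kalman update that fuses a visual pose measurement with an IMU-propagated attitude. The 2-state model, identity transition and measurement matrices, body-frame axis convention, and all noise values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def accel_pitch_roll(accel):
    """Pitch and roll from a 3-axis accelerometer reading while the UAV is
    stationary (gravity-only). Axis convention is an assumption."""
    ax, ay, az = accel / np.linalg.norm(accel)
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.hypot(ay, az))
    return pitch, roll


class SageHusaKF:
    """Minimal Sage-Husa adaptive Kalman filter over a 2-state vector
    [pitch, roll]: the IMU increment drives the prediction and the visual
    pose is the measurement. Noise matrices are placeholders."""

    def __init__(self, x0, P0, Q, R0, b=0.98):
        self.x = np.asarray(x0, dtype=float)   # state: [pitch, roll]
        self.P = np.asarray(P0, dtype=float)   # state covariance
        self.Q = np.asarray(Q, dtype=float)    # process noise covariance
        self.R = np.asarray(R0, dtype=float)   # measurement noise (adapted)
        self.b = b                             # forgetting factor, 0 < b < 1
        self.k = 0                             # step counter

    def step(self, gyro_delta, z_visual):
        # Predict: propagate the attitude with the IMU increment (F = I).
        x_pred = self.x + np.asarray(gyro_delta, dtype=float)
        P_pred = self.P + self.Q

        # Innovation against the visual pose measurement (H = I).
        v = np.asarray(z_visual, dtype=float) - x_pred

        # Sage-Husa recursion: adapt the measurement noise covariance R.
        # A practical implementation would also guard R against losing
        # positive definiteness.
        d = (1.0 - self.b) / (1.0 - self.b ** (self.k + 1))
        self.R = (1.0 - d) * self.R + d * (np.outer(v, v) - P_pred)

        # Standard Kalman update.
        S = P_pred + self.R
        K = P_pred @ np.linalg.inv(S)
        self.x = x_pred + K @ v
        self.P = (np.eye(2) - K) @ P_pred
        self.k += 1
        return self.x
```

The adaptive recursion re-estimates the measurement noise covariance from the observed innovations, so the filter automatically down-weights the visual pose when it becomes noisier, which is the role SHAKF plays in the fusion step described above.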