{"title":"合成视觉系统的传感器融合方法","authors":"D. Allerton, A.J. Clare","doi":"10.1109/DASC.2004.1391310","DOIUrl":null,"url":null,"abstract":"A millimetric radar imaging sensor can project a forward-looking view in a head-up display (HUD) to provide enhanced vision in the final stages of an approach, particularly in conditions of very low visibility. Although this increases situational awareness for the flight crew, the image quality is poor and there is no direct measure of system integrity. This paper describes a synthetic vision system using real-time image feature extraction to detect the runway in the image. This information is combined with knowledge of the aircraft position and attitude to provide flight guidance cues and to monitor the aircraft flight path. In the initial phase of the approach, GPS measurements are used to align the inertial reference system. During the final stages of an approach, inertial reference measurements are combined with imaging data to locate the vertices of the runway. Sensor fusion methods are used to provide flight guidance cues in the HUD and to determine system integrity measurements of the imaging system. A synthetic vision system overlays the computed runway position on the cluttered radar image and displays essential flight data. The paper outlines a radar model of the sensor, which runs on a PC-based visual system. This model has been used to provide a realistic real-time radar image during development of the tracking algorithms. The inertial reference system and the tracking system are also modeled and combined in an extended Kalman filter to provide flight guidance and to give timely warning of system failures to the flight crew. The paper describes the sensor fusion method developed for failure detection and provides examples of low visibility approaches flown in a flight simulator, to demonstrate the effectiveness of these techniques.","PeriodicalId":422463,"journal":{"name":"The 23rd Digital Avionics Systems Conference (IEEE Cat. No.04CH37576)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2004-10-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Sensor fusion methods for synthetic vision systems\",\"authors\":\"D. Allerton, A.J. Clare\",\"doi\":\"10.1109/DASC.2004.1391310\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A millimetric radar imaging sensor can project a forward-looking view in a head-up display (HUD) to provide enhanced vision in the final stages of an approach, particularly in conditions of very low visibility. Although this increases situational awareness for the flight crew, the image quality is poor and there is no direct measure of system integrity. This paper describes a synthetic vision system using real-time image feature extraction to detect the runway in the image. This information is combined with knowledge of the aircraft position and attitude to provide flight guidance cues and to monitor the aircraft flight path. In the initial phase of the approach, GPS measurements are used to align the inertial reference system. During the final stages of an approach, inertial reference measurements are combined with imaging data to locate the vertices of the runway. Sensor fusion methods are used to provide flight guidance cues in the HUD and to determine system integrity measurements of the imaging system. 
A synthetic vision system overlays the computed runway position on the cluttered radar image and displays essential flight data. The paper outlines a radar model of the sensor, which runs on a PC-based visual system. This model has been used to provide a realistic real-time radar image during development of the tracking algorithms. The inertial reference system and the tracking system are also modeled and combined in an extended Kalman filter to provide flight guidance and to give timely warning of system failures to the flight crew. The paper describes the sensor fusion method developed for failure detection and provides examples of low visibility approaches flown in a flight simulator, to demonstrate the effectiveness of these techniques.\",\"PeriodicalId\":422463,\"journal\":{\"name\":\"The 23rd Digital Avionics Systems Conference (IEEE Cat. No.04CH37576)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2004-10-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The 23rd Digital Avionics Systems Conference (IEEE Cat. No.04CH37576)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/DASC.2004.1391310\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The 23rd Digital Avionics Systems Conference (IEEE Cat. No.04CH37576)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DASC.2004.1391310","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Sensor fusion methods for synthetic vision systems
A millimetric radar imaging sensor can project a forward-looking view in a head-up display (HUD) to provide enhanced vision in the final stages of an approach, particularly in conditions of very low visibility. Although this increases situational awareness for the flight crew, the image quality is poor and there is no direct measure of system integrity. This paper describes a synthetic vision system that uses real-time image feature extraction to detect the runway in the image. This information is combined with knowledge of the aircraft position and attitude to provide flight guidance cues and to monitor the aircraft flight path. In the initial phase of the approach, GPS measurements are used to align the inertial reference system. During the final stages of an approach, inertial reference measurements are combined with imaging data to locate the vertices of the runway. Sensor fusion methods are used to provide flight guidance cues in the HUD and to derive integrity measures for the imaging system. The synthetic vision system overlays the computed runway position on the cluttered radar image and displays essential flight data. The paper outlines a radar model of the sensor, which runs on a PC-based visual system. This model has been used to provide a realistic real-time radar image during development of the tracking algorithms. The inertial reference system and the tracking system are also modelled and combined in an extended Kalman filter to provide flight guidance and to give timely warning of system failures to the flight crew. The paper describes the sensor fusion method developed for failure detection and provides examples of low-visibility approaches flown in a flight simulator to demonstrate the effectiveness of these techniques.
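The abstract describes combining inertial reference measurements with image-derived runway vertex positions in an extended Kalman filter, and using the same filter to warn of imaging-sensor failures. The sketch below is a minimal, generic EKF measurement update with a chi-square test on the innovation as a simple integrity monitor. It is illustrative only: the state and measurement definitions, noise covariances, and the 0.999 threshold are assumptions for the example, not the authors' implementation.

```python
# Minimal sketch of an EKF measurement update fusing inertial-predicted
# runway vertex positions with image-derived detections, plus an
# innovation-based integrity check. Illustrative assumptions throughout.
import numpy as np
from scipy.stats import chi2


def ekf_update(x, P, z, h, H, R):
    """One EKF measurement update.

    x : (n,)   predicted state (e.g. position, velocity, attitude errors)
    P : (n,n)  predicted state covariance
    z : (m,)   measurement (e.g. image coordinates of runway vertices)
    h : callable, nonlinear measurement model h(x) -> (m,)
    H : (m,n)  Jacobian of h evaluated at x
    R : (m,m)  measurement noise covariance
    Returns updated state, covariance, and the normalised innovation
    squared (NIS) used for the integrity check.
    """
    v = z - h(x)                          # innovation (measurement residual)
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x + K @ v
    P_new = (np.eye(len(x)) - K @ H) @ P
    nis = float(v @ np.linalg.solve(S, v))
    return x_new, P_new, nis


def integrity_check(nis, dof, alpha=0.999):
    """Flag the imaging sensor if the innovation is statistically
    inconsistent with the filter's predicted uncertainty.
    Returns False when a possible sensor failure should be reported."""
    threshold = chi2.ppf(alpha, dof)      # assumed false-alarm level
    return nis <= threshold
```

In a scheme of this kind, a run of failed integrity checks on the image-derived measurements would be the trigger for the timely failure warning to the flight crew that the abstract refers to, while the filter itself continues to propagate guidance from the inertial and GPS data.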