Sensor fusion methods for synthetic vision systems

D. Allerton, A.J. Clare
DOI: 10.1109/DASC.2004.1391310
Published in: The 23rd Digital Avionics Systems Conference (IEEE Cat. No.04CH37576), 24 October 2004
Citations: 6

Abstract

A millimetric radar imaging sensor can project a forward-looking view in a head-up display (HUD) to provide enhanced vision in the final stages of an approach, particularly in conditions of very low visibility. Although this increases situational awareness for the flight crew, the image quality is poor and there is no direct measure of system integrity. This paper describes a synthetic vision system using real-time image feature extraction to detect the runway in the image. This information is combined with knowledge of the aircraft position and attitude to provide flight guidance cues and to monitor the aircraft flight path. In the initial phase of the approach, GPS measurements are used to align the inertial reference system. During the final stages of an approach, inertial reference measurements are combined with imaging data to locate the vertices of the runway. Sensor fusion methods are used to provide flight guidance cues in the HUD and to determine system integrity measurements of the imaging system. A synthetic vision system overlays the computed runway position on the cluttered radar image and displays essential flight data. The paper outlines a radar model of the sensor, which runs on a PC-based visual system. This model has been used to provide a realistic real-time radar image during development of the tracking algorithms. The inertial reference system and the tracking system are also modeled and combined in an extended Kalman filter to provide flight guidance and to give timely warning of system failures to the flight crew. The paper describes the sensor fusion method developed for failure detection and provides examples of low visibility approaches flown in a flight simulator, to demonstrate the effectiveness of these techniques.
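The abstract describes combining inertial reference measurements with image-derived runway positions in an extended Kalman filter, using the fusion residuals both for guidance and for integrity monitoring. As a minimal sketch of that idea (not the authors' implementation — the state model, noise values, and chi-square gate threshold here are illustrative assumptions, and a linear filter stands in for the paper's extended Kalman filter):

```python
import numpy as np

# Sketch: fuse an inertial position/velocity prediction with an
# image-derived position measurement, gating the innovation as a
# simple integrity check. All numeric values are illustrative.

def predict(x, P, a, dt, q):
    """Propagate state [position, velocity] using inertial acceleration a."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])
    x = F @ x + B * a
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, r, gate=9.0):
    """Fuse a position measurement z; flag it as suspect if the
    normalized innovation squared exceeds the gate (chi-square, 1 dof)."""
    H = np.array([[1.0, 0.0]])
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + r                  # innovation covariance
    nis = float(y @ np.linalg.inv(S) @ y)
    if nis > gate:                       # integrity flag: reject measurement
        return x, P, False
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P, True

# One predict/update cycle: 50 m/s along-track, image fix at 5.2 m.
x = np.array([0.0, 50.0])
P = np.eye(2) * 10.0
x, P = predict(x, P, a=0.0, dt=0.1, q=0.1)
x, P, ok = update(x, P, z=np.array([5.2]), r=4.0)
```

A consistent measurement pulls the state estimate toward the image fix and shrinks the covariance; a measurement whose innovation fails the gate is rejected, which is the kind of residual-based check the paper uses to give the crew timely warning of imaging-system failures.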