Automated Extraction Information System from HUDs Images Using ANN
L. E. G. D. Vasconcelos, A. Y. Kusumoto, N. Leite, C. M. A. Lopes
Proceedings of the International Conference on Information Technology: New Generations (ITNG), 2015, pp. 657-661. DOI: 10.1109/ITNG.2015.110
Abstract
In this paper, the recognition of information in aircraft Head-Up Display (HUD) images is performed using an artificial neural network (ANN) and a correlation algorithm. During flight tests, the images displayed on the HUD can be stored for later analysis. HUD images present a large amount of aircraft data provided by the avionics system (e.g., altitude in feet, time). HUD images are therefore a primary source of information for most aircraft and pilots, especially in military missions. At IPEV (the Flight Test Research Institute), the extraction of information from HUD images has been performed manually, frame by frame, for later analysis. The main issue is that one hour of flight test generates about 36,000 frames, so data extraction becomes complex, time consuming and prone to failures. To reduce these problems, IPEV developed an algorithm that loads HUD images and then partitions each image into regions, which are classified, recognized and converted into text using an ANN and a correlation algorithm. The development of the algorithm is presented in this paper.
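
The abstract describes a pipeline of loading a HUD frame, partitioning it into regions, and recognizing each region's characters with an ANN plus a correlation algorithm. The paper's own implementation is not reproduced here; the following is a minimal, hypothetical Python/NumPy sketch of that load → partition → recognize → text flow, using template correlation for the recognition step. The region coordinates, glyph width, template dictionary, and score threshold are assumptions introduced purely for illustration.

```python
# Illustrative sketch only: the paper does not publish its implementation.
# HUD_REGIONS, CHAR_W, the template set and min_score are hypothetical
# placeholders showing the general load -> partition -> recognize pipeline.
import numpy as np

# Hypothetical fixed HUD layout: field name -> (top, left, height, width) in pixels.
HUD_REGIONS = {
    "altitude": (40, 500, 18, 60),
    "airspeed": (40, 60, 18, 60),
    "time":     (460, 280, 18, 84),
}

CHAR_W = 12  # assumed fixed glyph width inside a field


def binarize(gray, threshold=128):
    """Convert a grayscale patch to a 0/1 mask (HUD symbology is bright on dark)."""
    return (gray >= threshold).astype(np.float32)


def normalized_correlation(cell, template):
    """Zero-mean normalized correlation between a character cell and a template."""
    a = cell - cell.mean()
    b = template - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0


def recognize_field(gray_frame, region, templates, min_score=0.5):
    """Crop one HUD field, split it into fixed-width cells, and label each cell
    with the best-correlating character template (an ANN classifier could score
    the same cells instead of, or in addition to, the correlation)."""
    top, left, h, w = region
    patch = binarize(gray_frame[top:top + h, left:left + w])
    chars = []
    for x in range(0, w - CHAR_W + 1, CHAR_W):
        cell = patch[:, x:x + CHAR_W]
        best_char, best_score = "", 0.0
        for char, template in templates.items():
            score = normalized_correlation(cell, template)
            if score > best_score:
                best_char, best_score = char, score
        chars.append(best_char if best_score >= min_score else "?")
    return "".join(chars)


def extract_frame(gray_frame, templates):
    """Run every configured HUD region and return field name -> recognized text."""
    return {name: recognize_field(gray_frame, reg, templates)
            for name, reg in HUD_REGIONS.items()}
```

In practice the character templates would be cut from reference frames of the same HUD symbology, and an ANN trained on the same character cells could replace or confirm the correlation scores, mirroring the ANN-plus-correlation combination described in the abstract.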