Visual homing guidance for projectiles using event-cameras
Marceau Bamond, N. Hueber, G. Strub, S. Changey, Jonathan Weber
Security and Defence Quarterly, vol. 72, no. 1, pp. 122740H - 122740H-9, published 2022-11-04
DOI: 10.1117/12.2638477 (https://doi.org/10.1117/12.2638477)
Abstract
Compared to frame-based visual streams, event-driven visual streams offer very low bandwidth requirements and high temporal resolution, making them an attractive choice for embedded object recognition. Such visual systems are expected to outperform standard cameras, but they have not yet been studied in the context of homing guidance for projectiles, where navigation constraints are drastic. This work starts from a first interaction model between a standard camera and an event camera, validated for unattended ground sensors and situational-awareness applications from a static position. In this paper we propose to extend this first interaction model with higher-level activity analysis and object recognition from a moving position. The proposed event-based terminal guidance system is first studied through a target laser-designation scenario and through optical-flow computation to validate the guidance parameters. Real-time embedded processing techniques are evaluated, preparing the design of a future demonstrator of a very fast navigation system. The first results have been obtained on embedded Linux architectures with multi-threaded feature extraction. This paper presents and discusses these first results.
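The abstract does not say how the optical flow is computed from the event stream. As an illustration only, the sketch below shows one standard event-based approach, local plane fitting on event timestamps (in the spirit of Benosman et al.'s event-based visual flow): if events in a small patch satisfy t = a·x + b·y + c, the normal flow is v = (a, b) / (a² + b²). The `Event` struct and `fitNormalFlow` function are names invented for this sketch, not taken from the paper.

```cpp
// Minimal sketch (not the paper's method): normal optical flow from a local
// patch of events by least-squares plane fitting of timestamps t(x, y).
#include <cmath>
#include <cstdio>
#include <vector>

struct Event {            // hypothetical event record
    double x, y;          // pixel coordinates
    double t;             // timestamp in seconds
};

// Fit t = a*x + b*y + c over the patch and return the normal flow (vx, vy).
// Returns false when the fit is degenerate (too few or collinear events).
bool fitNormalFlow(const std::vector<Event>& ev, double& vx, double& vy) {
    if (ev.size() < 3) return false;
    // Accumulate the 3x3 normal equations of the least-squares problem.
    double Sxx=0, Sxy=0, Sx=0, Syy=0, Sy=0, S=0, Sxt=0, Syt=0, St=0;
    for (const Event& e : ev) {
        Sxx += e.x*e.x; Sxy += e.x*e.y; Sx += e.x;
        Syy += e.y*e.y; Sy  += e.y;     S  += 1.0;
        Sxt += e.x*e.t; Syt += e.y*e.t; St += e.t;
    }
    // Solve [Sxx Sxy Sx; Sxy Syy Sy; Sx Sy S] [a b c]^T = [Sxt Syt St]^T
    // with Cramer's rule.
    double det = Sxx*(Syy*S - Sy*Sy) - Sxy*(Sxy*S - Sy*Sx) + Sx*(Sxy*Sy - Syy*Sx);
    if (std::fabs(det) < 1e-12) return false;
    double a = (Sxt*(Syy*S - Sy*Sy) - Sxy*(Syt*S - Sy*St) + Sx*(Syt*Sy - Syy*St)) / det;
    double b = (Sxx*(Syt*S - Sy*St) - Sxt*(Sxy*S - Sy*Sx) + Sx*(Sxy*St - Syt*Sx)) / det;
    double g2 = a*a + b*b;               // |grad t|^2
    if (g2 < 1e-12) return false;        // flat plane: no measurable motion
    vx = a / g2;                         // v = grad(t) / |grad(t)|^2
    vy = b / g2;
    return true;
}

int main() {
    // Synthetic patch: a vertical edge moving right at 100 px/s, so events
    // at column x fire at time t = x / 100 (i.e. a = 0.01, b = 0).
    std::vector<Event> patch;
    for (int x = 0; x < 5; ++x)
        for (int y = 0; y < 5; ++y)
            patch.push_back({double(x), double(y), x / 100.0});
    double vx, vy;
    if (fitNormalFlow(patch, vx, vy))
        std::printf("normal flow: vx=%.1f px/s, vy=%.1f px/s\n", vx, vy);
    return 0;
}
```

On the synthetic patch above the fit recovers vx = 100 px/s and vy = 0, matching the simulated edge motion; on real event data the patch would be a spatio-temporal neighbourhood of each incoming event.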
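The abstract likewise mentions multi-threaded feature extraction on embedded Linux without implementation details. The sketch below only guesses at the general pattern such a system might use: a producer slices the event stream into batches, and a pool of worker threads consumes them from a shared queue. `EventBatch` and `extractFeatures` are placeholders invented here, not the paper's code.

```cpp
// Minimal producer/consumer sketch (pattern only, not the paper's design):
// worker threads pull event batches from a shared queue and run a
// placeholder feature extraction on each.
#include <atomic>
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct EventBatch { std::vector<int> events; };   // hypothetical batch type

std::queue<EventBatch> g_queue;
std::mutex g_mtx;
std::condition_variable g_cv;
std::atomic<bool> g_done{false};

// Stand-in for a real extractor (corners, clusters, laser-spot detection...).
void extractFeatures(const EventBatch& b) {
    std::printf("processed batch of %zu events\n", b.events.size());
}

void worker() {
    for (;;) {
        std::unique_lock<std::mutex> lk(g_mtx);
        g_cv.wait(lk, [] { return !g_queue.empty() || g_done.load(); });
        if (g_queue.empty()) return;           // stream ended and queue drained
        EventBatch b = std::move(g_queue.front());
        g_queue.pop();
        lk.unlock();                           // extract outside the lock
        extractFeatures(b);
    }
}

int main() {
    std::vector<std::thread> pool;
    for (int i = 0; i < 4; ++i) pool.emplace_back(worker);

    // Producer: ten synthetic batches stand in for the sensor stream.
    for (int i = 0; i < 10; ++i) {
        EventBatch b{std::vector<int>(1000 + i)};
        { std::lock_guard<std::mutex> lk(g_mtx); g_queue.push(std::move(b)); }
        g_cv.notify_one();
    }
    g_done = true;
    g_cv.notify_all();
    for (auto& t : pool) t.join();
    return 0;
}
```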