Intelligent Motion Video Guidance for Unmanned Air System Ground Target Surveillance
J. Valasek, K. Kirkpatrick, James May, Joshua Harris
J. Aerosp. Inf. Syst., published 2018-01-08. DOI: 10.2514/1.I010198
Citations: 18
Abstract
Unmanned air systems with video capture systems for surveillance and visual tracking of ground targets have worked relatively well when employing gimbaled cameras controlled by two or more operators: one to fly the vehicle, and one to orient the camera and visually track ground targets. However, autonomous operation to reduce operator workload and crew levels is more challenging when the camera is strapdown, i.e., fixed to the airframe without a pan-and-tilt capability, rather than gimbaled, so that the vehicle itself must be steered to orient the camera field of view. Visual tracking becomes even more difficult when the target follows an unpredictable path. This paper investigates a machine learning algorithm for visual tracking of stationary and moving ground targets by unmanned air systems with nongimbaled cameras whose pan and tilt are fixed. The algorithm is based on Q-learning, and the learning agent initially determines an offline control policy for vehicle orientation and flight path such that a target can be tra...
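The abstract describes a Q-learning agent that learns an offline policy for steering the vehicle so a strapdown camera stays pointed at the target. A minimal sketch of that idea, on a deliberately simplified toy problem, is shown below: the agent picks discrete turn actions so that one of eight headings aligns with a fixed target bearing. The state and action sets, reward shaping, and hyperparameters here are illustrative assumptions, not details taken from the paper.

```python
import random

N_HEADINGS = 8          # discrete vehicle headings (toy stand-in for orientation state)
ACTIONS = (-1, 0, +1)   # turn left, hold heading, turn right
TARGET_BEARING = 0      # target direction, in heading units (hypothetical)

def step(heading, action):
    """Apply a turn action; reward 1 when the camera axis points at the target."""
    heading = (heading + action) % N_HEADINGS
    reward = 1.0 if heading == TARGET_BEARING else 0.0
    return heading, reward

def train(episodes=500, steps=20, alpha=0.5, gamma=0.9, epsilon=0.1, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    Q = [[0.0] * len(ACTIONS) for _ in range(N_HEADINGS)]
    for _ in range(episodes):
        h = rng.randrange(N_HEADINGS)          # start each episode at a random heading
        for _ in range(steps):
            if rng.random() < epsilon:         # explore
                a = rng.randrange(len(ACTIONS))
            else:                              # exploit current estimate
                a = max(range(len(ACTIONS)), key=lambda i: Q[h][i])
            h2, r = step(h, ACTIONS[a])
            # Standard Q-learning update toward the bootstrapped target
            Q[h][a] += alpha * (r + gamma * max(Q[h2]) - Q[h][a])
            h = h2
    return Q

def greedy_policy(Q):
    """Extract the learned turn command for each heading."""
    return [ACTIONS[max(range(len(ACTIONS)), key=lambda i: Q[h][i])]
            for h in range(N_HEADINGS)]
```

After training, the greedy policy turns the vehicle along the shortest rotational path toward the target bearing, which is the offline-policy flavor the abstract describes: learn first, then fly the fixed policy.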