{"title":"FMCW radar for the sense function of sense and avoid systems onboard UAVs","authors":"E. Itcia, J. Wasselin, S. Mazuel, M. Otten, A. Huizing","doi":"10.1117/12.2028518","DOIUrl":"https://doi.org/10.1117/12.2028518","url":null,"abstract":"The Rockwell Collins France (RCF) radar department is currently developing, in close collaboration with TNO in The Hague, The Netherlands, a Frequency Modulated Continuous Wave (FMCW) radar sensor dedicated to the obstacle warning function and, potentially, to air traffic detection. The sensor combines floodlight illumination and digital beamforming to accommodate demanding detection and coverage requirements. Performance has been evaluated in flight tests, and the results show that such a radar sensor is a good candidate for the sense function of sense and avoid systems onboard UAVs.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125597094","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
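The record above concerns an FMCW obstacle-warning radar. In a linear-sweep FMCW radar, target range follows from the beat frequency via the textbook relation R = c · f_b · T / (2B); a minimal sketch of that relation (generic, not specific to the RCF/TNO sensor):

```python
# Range from beat frequency in a sawtooth FMCW radar: R = c * f_b * T / (2 * B).
# Textbook relation only; sweep parameters below are illustrative.
C = 3e8  # speed of light, m/s


def fmcw_range(beat_hz: float, sweep_time_s: float, bandwidth_hz: float) -> float:
    """Target range in metres for a given beat frequency."""
    return C * beat_hz * sweep_time_s / (2.0 * bandwidth_hz)


# e.g. for a 150 MHz sweep over 1 ms, a 10 kHz beat corresponds to 10 m
print(fmcw_range(10e3, 1e-3, 150e6))
```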
{"title":"Multi-modal target detection for autonomous wide area search and surveillance","authors":"T. Breckon, A. Gaszczak, Jiwan Han, M. Eichner, Stuart Barnes","doi":"10.1117/12.2028340","DOIUrl":"https://doi.org/10.1117/12.2028340","url":null,"abstract":"Generalised wide-area search and surveillance is a commonplace tasking for multi-sensory equipped autonomous systems. Here we address a key supporting topic for this task: the automatic interpretation, fusion and reporting of detected targets from multi-modal sensor information received from multiple autonomous platforms deployed for wide-area environment search. We detail the realization of a real-time methodology for the automated detection of people and vehicles using combined visible-band (EO), thermal-band (IR) and radar sensing from a deployed network of multiple autonomous platforms (ground and aerial). This facilitates real-time target detection, reported with varying levels of confidence, using information from both multiple sensors and multiple sensor platforms to provide environment-wide situational awareness. A range of automatic classification approaches are proposed, driven by underlying machine learning techniques, that facilitate the automatic detection of either target type with cross-modal target confirmation. Extended results are presented that show the detection of both people and vehicles under varying conditions, in both isolated rural and cluttered urban environments, with minimal false positive detection. Performance evaluation is presented at an episodic level, with individual classifiers optimized for maximal detection of each object of interest (vehicle/person) over a given search path/pattern of the environment, across all sensors and modalities, rather than on a per-sensor-sample basis. Episodic target detection, evaluated over a number of wide-area environment search and reporting tasks, generally exceeds 90% for the targets considered here.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"137 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127378863","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Management of unmanned moving sensors through human decision layers: a bi-level optimization process with calls to costly sub-processes","authors":"F. Dambreville","doi":"10.1117/12.2032071","DOIUrl":"https://doi.org/10.1117/12.2032071","url":null,"abstract":"While there is a variety of approaches and algorithms for optimizing the mission of an unmanned moving sensor, far less work deals with the implementation of several sensors within a human organization. In this case, the management of the sensors is done through at least one human decision layer, and the sensor management as a whole arises as a bi-level optimization process. In this work, the following hypotheses are considered realistic: first-level sensor handlers plan their sensors by means of elaborate algorithmic tools based on accurate modelling of the environment; the higher level plans the handled sensors according to a global observation mission, on the basis of an approximated model of the environment and of the first-level sub-processes. This problem is formalized very generally as the maximization of an unknown function, defined a priori by sampling a known random function (the law of model error). In such a case, each actual evaluation of the function increases the knowledge about the function, and subsequently the efficiency of the maximization. The issue is to optimize the sequence of values to be evaluated, with regard to the evaluation costs. There is a fundamental link here with the domain of experimental design. Jones, Schonlau and Welch proposed a general method, Efficient Global Optimization (EGO), for solving this problem in the case of an additive functional Gaussian law. In our work, a generalization of EGO is proposed, based on a rare event simulation approach. It is applied to the aforementioned bi-level sensor planning.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123606506","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
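The EGO method cited in the abstract above selects each next evaluation point by maximizing the expected improvement (EI) of a Gaussian posterior. A sketch of the standard EI criterion of Jones, Schonlau and Welch (not the paper's rare-event generalization):

```python
# Expected Improvement for a maximisation problem under a Gaussian
# posterior N(mu, sigma^2) at a candidate point; f_best is the best
# value observed so far. Standard EGO acquisition, stated for reference.
import math


def expected_improvement(mu: float, sigma: float, f_best: float) -> float:
    """EI = (mu - f_best) * Phi(z) + sigma * phi(z), with z = (mu - f_best) / sigma."""
    if sigma <= 0.0:
        return max(mu - f_best, 0.0)  # no posterior uncertainty left
    z = (mu - f_best) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal Phi(z)
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # phi(z)
    return (mu - f_best) * cdf + sigma * pdf
```

EI is zero only when there is neither predicted gain nor uncertainty, which is what drives EGO's trade-off between exploiting good predictions and exploring uncertain regions.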
{"title":"Meeting performance and sensing-cost requirements for detection and recognition systems","authors":"C. Willis","doi":"10.1117/12.2027582","DOIUrl":"https://doi.org/10.1117/12.2027582","url":null,"abstract":"The modelling of Automatic Target Detection, Recognition and Identification performance in systems of multiple sensors and/or platforms is important in many respects: for example, in the selection of sensors or sensor combinations effective enough to achieve operational requirements, or in understanding how the system might best be exploited. A sensing system may comprise sensors of several different types, including active and passive approaches in the radio frequency and optical portions of the spectrum. Some may have well-understood performance, whereas others may be only poorly characterised. A simulation framework has been developed to examine sensor options across different sensor types, parameterisations, search strategies and applications. The framework is based on Bayesian Decision Theoretic principles, along with simple models of the sensors and search environment. It uses Monte-Carlo simulation to derive statistical measures of system performance. The framework has been designed to encompass detection, recognition and identification problems and also to treat sensor characterisation. The modelling framework has been applied to a number of illustrative problems. These range from simple target detection scenarios using sensors of differing performance or different regional search schemes, through to examinations of: the number of measurements required to reach threshold performance; the effects of sensor measurement cost; issues relating to the poor characterisation of sensors within the system; and the performance of combined detection and recognition sensor systems. Results are presented illustrating these effects. These generally show that the method is able to quantify qualitative expectations of performance, and is sufficiently powerful to highlight some unexpected aspects of operation.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125317879","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Combined airborne sensors in urban environment","authors":"A. Dimmeler, H. Schilling, M. Shimoni, D. Bulatov, W. Middelmann","doi":"10.1117/12.2028648","DOIUrl":"https://doi.org/10.1117/12.2028648","url":null,"abstract":"Military operations in urban areas have become more relevant over the past decades. Detailed situational awareness in these complex environments is crucial for successful operations. Within the EDA (European Defence Agency) project “Detection in Urban scenario using Combined Airborne imaging Sensors” (DUCAS), an extensive data set of hyperspectral and high-spatial-resolution data, as well as three-dimensional (3D) laser data, was generated in a common field trial in the city of Zeebrugge, Belgium, in 2011. In the frame of DUCAS, methods were developed at two levels of processing. At the first level, single-sensor data were used for land cover mapping and the detection of targets of interest (i.e. personnel, vehicles and objects). At the second level, data fusion was applied at the pixel level as well as the information level to investigate the benefits of combining sensor systems in an operational context. Providing data for mission planning and mapping is an important task for aerial reconnaissance, and it includes the creation or update of high-quality 2D and 3D maps. In DUCAS, semi-automatic methods and a wide range of sensor data (hyperspectral, LIDAR, high-resolution orthophotos and video data) were used for the creation of highly detailed land cover maps as well as urban terrain models. Combining the diverse information gained by different sensors increases the information content and the quality of the extracted information. In this paper we present advanced methods for the creation of 2D/3D maps, show results, and demonstrate the benefit of fusing multi-sensor data.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121772959","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Super-resolution imaging applied to moving targets in high dynamics scenes","authors":"Olegs Mise, T. Breckon","doi":"10.1117/12.2028743","DOIUrl":"https://doi.org/10.1117/12.2028743","url":null,"abstract":"In modern tracking systems the ability to obtain a high-quality, high-resolution appearance of the tracked target is often highly desirable. However, the reality of operational deployment often means that imaging systems deployed for this task suffer from limitations reducing effective image quality. These limitations can be attributed to a range of causes such as low-quality video sensors, system noise, high target dynamics and other environmental noise factors. Despite the advantages of super-resolution techniques, handling complex motion remains a challenging task for effective super-resolution implementation. The computational complexity and large memory requirements of super-resolution imaging largely restrict the use of these techniques in real-time hardware implementations. In order to improve the visual quality of the tracked target and overcome these limitations, we propose a simple yet effective solution that integrates a super-resolution imaging approach, based on a combination of Sum of Absolute Differences (SAD) and gradient-descent motion estimation techniques, into a novel tracking approach. In addition, the proposed approach demonstrates robustness in improved target appearance modeling that assists the overall tracking system. The presented results demonstrate a significant improvement in visual target representation whilst tracking over highly dynamic scenes. The implementation simplicity of the proposed approach makes it an attractive solution for realization on low-power hardware. Such a system can be deployed on small unmanned aerial vehicles (UAVs) or other hardware where size, weight and power (SWaP) are of particular concern.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134637139","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
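The Sum of Absolute Differences (SAD) motion estimation named in the abstract above is a block-matching scheme: slide a reference block over a search window and keep the displacement with the lowest absolute-difference cost. A minimal grayscale sketch (illustrative only, not the authors' implementation):

```python
# Exhaustive SAD block matching over a square search window.
# ref_block is a patch from the previous frame; (top, left) is its
# original position; radius bounds the displacement search.
import numpy as np


def sad_match(ref_block: np.ndarray, frame: np.ndarray,
              top: int, left: int, radius: int) -> tuple[int, int]:
    """Return the (dy, dx) displacement minimising the SAD cost."""
    h, w = ref_block.shape
    best, best_dy, best_dx = float("inf"), 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > frame.shape[0] or x + w > frame.shape[1]:
                continue  # candidate window falls outside the frame
            cand = frame[y:y + h, x:x + w].astype(int)
            cost = np.abs(cand - ref_block.astype(int)).sum()
            if cost < best:
                best, best_dy, best_dx = cost, dy, dx
    return best_dy, best_dx
```

In practice a gradient-descent refinement, as the abstract combines with SAD, replaces the exhaustive loop once a coarse match is found, which is what makes the approach cheap enough for SWaP-constrained hardware.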
{"title":"Snapshot imaging Mueller matrix instrument","authors":"M. Kudenov, S. Mallik, M. Escuti, N. Hagen, K. Oka, E. Dereniak","doi":"10.1117/12.2028546","DOIUrl":"https://doi.org/10.1117/12.2028546","url":null,"abstract":"A novel way to measure the Mueller matrix image enables a sample's diattenuation, retardance, and depolarization to be measured within a single camera integration period. Since the Mueller matrix components are modulated onto coincident carrier frequencies, the described technique provides unique solutions to image registration problems for moving objects. In this paper, a snapshot imaging Mueller matrix polarimeter is theoretically described, and preliminary results show it to be a viable approach for use in surface characterization of moving objects.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"128 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121378161","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Optical system components for navigation grade fiber optic gyroscopes","authors":"M. Heimann, M. Liesegang, Norbert Arndt-Staufenbiel, H. Schröder, K. Lang","doi":"10.1117/12.2029097","DOIUrl":"https://doi.org/10.1117/12.2029097","url":null,"abstract":"Interferometric fiber optic gyroscopes belong to the class of inertial sensors. Due to their high accuracy they are used for absolute position and rotation measurement in manned/unmanned vehicles, e.g. submarines, ground vehicles, aircraft or satellites. The important system components are the light source, the electro-optical phase modulator, the optical fiber coil and the photodetector. This paper focuses on approaches to realizing a stable light source and fiber coil. A superluminescent diode and an erbium-doped fiber laser were studied as candidates for an accurate and stable light source; the influence of the polarization grade of the source and the effects of back reflections into the source were examined. During operation, thermal working conditions severely affect the accuracy and stability of the optical fiber coil, which is the sensing element. Thermal gradients applied to the fiber coil have large negative effects on the achievable system accuracy of the gyroscope. Therefore, a way of calculating and compensating the rotation rate error of a fiber coil due to thermal change is introduced. A simplified three-dimensional finite element model (FEM) of a quadrupole-wound fiber coil is used to determine the build-up of thermal fields in the polarization-maintaining fiber due to outside heating sources. The rotation rate error due to these sources is then calculated and compared to measurement data. A simple regression model is used to compensate the rotation rate error with temperature measurement at the outside of the fiber coil. To realize a compact and robust optical package for some of the relevant optical system components, an approach based on ion-exchanged waveguides in thin glass was developed. These waveguides are used to realize 1x2 and 1x4 splitters with a fiber coupling interface or direct photodiode coupling.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"205 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122413341","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An object-oriented modeling and simulation framework for bearings-only multi-target tracking using an unattended acoustic sensor network","authors":"M. Aslan","doi":"10.1117/12.2029345","DOIUrl":"https://doi.org/10.1117/12.2029345","url":null,"abstract":"Tracking ground targets using low-cost ground-based sensors is a challenging field because of the limited capabilities of such sensors. Among the several candidates, including seismic and magnetic sensors, acoustic sensors based on microphone arrays have the potential to be useful: they can provide a direction to the sound source, they can have a comparatively good range, and the sound characteristics can provide a basis for target classification. However, there are still many problems. One is the difficulty of resolving multiple sound sources; another is that they do not provide distance; a third is the presence of background noise from wind, sea, rain, distant air and land traffic, people, etc.; and a fourth is that the same target can sound very different depending on factors such as terrain type, topography, speed, gear, distance, etc. The use of sophisticated signal processing and data fusion algorithms is key to compensating (to an extent) for the limited capabilities and mentioned problems of these sensors. It is hard, if not impossible, to evaluate the performance of such complex algorithms analytically. For an effective evaluation, before performing expensive field trials, well-designed laboratory experiments and computer simulations are necessary. Along this line, in this paper, we present an object-oriented modeling and simulation framework which can be used to generate simulated data for data fusion algorithms for tracking multiple on-road targets in an unattended acoustic sensor network. Each sensor node in the network is a circular microphone array which produces direction of arrival (DOA) (or bearing) measurements of the targets and sends this information to a fusion center. We present the models for road networks, targets (motion and acoustic power) and acoustic sensors in an object-oriented fashion, where different and possibly time-varying sampling periods for each sensor node are possible. Moreover, the sensor’s signal processing and detection blocks are modeled using a parametric approach by associating a receiver operating characteristics (ROC) curve with the whole process, which results in false alarms as well as missed detections. The proposed simulation environment can be used for ground-truth and synthetic data generation for road-constrained multiple target tracking in an unattended acoustic sensor network.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"53 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121921945","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
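A bearings-only sensor model of the kind the abstract above describes can be sketched very compactly: each scan yields a noisy DOA with detection probability Pd and, independently, a false alarm drawn uniformly over bearing, mimicking a fixed ROC operating point. All names and parameter values below are illustrative, not taken from the paper:

```python
# Minimal bearings-only acoustic node: noisy DOA with missed detections
# (prob. 1 - p_detect) and uniform false alarms (prob. p_false per scan).
import math
import random


def scan(sensor_xy, target_xy, p_detect=0.9, p_false=0.05,
         sigma_rad=0.02, rng=random):
    """Return a list of bearing measurements (radians) for one scan."""
    meas = []
    if rng.random() < p_detect:  # target detected this scan
        bearing = math.atan2(target_xy[1] - sensor_xy[1],
                             target_xy[0] - sensor_xy[0])
        meas.append(bearing + rng.gauss(0.0, sigma_rad))
    if rng.random() < p_false:   # false alarm, uniform over [-pi, pi)
        meas.append(rng.uniform(-math.pi, math.pi))
    return meas
```

Sweeping (p_detect, p_false) pairs along a ROC curve reproduces the parametric detection-block behaviour the framework models, without simulating the microphone-array signal processing itself.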
{"title":"High precision object geo-localization and visualization in sensor networks","authors":"S. Lemaire, C. Bodensteiner, Michael Arens","doi":"10.1117/12.2028734","DOIUrl":"https://doi.org/10.1117/12.2028734","url":null,"abstract":"The wide availability of previously acquired, geo-referenced imagery enables automatic video-based solutions for high-precision object geo-localization and cooperative visualization. We present a system which geo-references objects seen in UAV video streams, distributes this information in a sensor network, and visualizes them on modern smartphones using augmented reality techniques. The feasibility of the approach was experimentally validated using mini-UAV (\"MD-400\") and high-altitude UAV video footage in combination with modern off-the-shelf smartphones. Applications are widespread and include, for instance, crisis and disaster management and military applications.","PeriodicalId":344928,"journal":{"name":"Optics/Photonics in Security and Defence","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127050495","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}