{"title":"A moving ISAR-object recognition using pi-sigma neural networks based on histogram of oriented gradient of edge","authors":"Asma Elyounsi, H. Tlijani, M. Bouhlel","doi":"10.1080/19479832.2021.1953620","DOIUrl":null,"url":null,"abstract":"ABSTRACT Detection and classification with traditional neural networks methods such as multilayer perceptron (MLP), feed forward network and back propagation neural networks show several drawbacks including the rate of convergence and the incapacity facing the problems of size of the image especially for radar images. As a result, these methods are being replaced by other evolutional classification methods such as Higher Order Neural Networks (HONN) (Functional Link Artificial Neural Network (FLANN), Pi Sigma Neural Network (PSNN), Neural Network Product Unit (PUNN) and Neural Network of the Higher Order Processing Unit. So, in this paper, we address radar object detection and classification problems with a new strategy by using PSNN and a new proposed method HOGE for edges features extraction based on morphological operators and histogram of oriented gradient. Thus, in order to recognise radar object, we extract HOG features of the object region and classify our target with PSNN. The HOGE features vector is used as input of pi-sigma NN. 
The proposed method was tested and confirmed based on experiments through the use of 2D and 3D ISAR images.","PeriodicalId":46012,"journal":{"name":"International Journal of Image and Data Fusion","volume":"13 1","pages":"297 - 315"},"PeriodicalIF":1.8000,"publicationDate":"2021-08-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Image and Data Fusion","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/19479832.2021.1953620","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"REMOTE SENSING","Score":null,"Total":0}
Citations: 3
Abstract
Detection and classification with traditional neural network methods such as the multilayer perceptron (MLP), feed-forward networks and back-propagation neural networks show several drawbacks, including slow convergence and an inability to handle large image sizes, especially for radar images. As a result, these methods are being replaced by more evolved classification methods such as Higher Order Neural Networks (HONN): the Functional Link Artificial Neural Network (FLANN), the Pi-Sigma Neural Network (PSNN), the Product Unit Neural Network (PUNN) and the Higher Order Processing Unit Neural Network. In this paper, we address radar object detection and classification with a new strategy that combines a PSNN with a newly proposed method, HOGE, for edge feature extraction based on morphological operators and the histogram of oriented gradients (HOG). To recognise a radar object, we extract HOG features of the object region and classify the target with the PSNN; the HOGE feature vector is used as the input of the pi-sigma network. The proposed method was tested and validated through experiments on 2D and 3D ISAR images.
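The pipeline the abstract describes (morphological edge extraction, then HOG features, then pi-sigma classification) can be sketched in a few lines of numpy. This is a minimal illustrative sketch, not the authors' HOGE implementation: the structuring-element size, HOG cell size, bin count and number of summing units are assumed values chosen for the example.

```python
import numpy as np

def morphological_gradient(img, k=3):
    """Edge map as dilation minus erosion with a k x k square structuring element."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    # All k x k neighbourhoods of each pixel (requires numpy >= 1.20)
    windows = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    return windows.max(axis=(-1, -2)) - windows.min(axis=(-1, -2))

def hog_features(img, cell=8, bins=9):
    """Simple histogram of oriented gradients: per-cell, magnitude-weighted,
    unsigned orientations in [0, 180), each cell histogram L2-normalised."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            m = mag[i:i + cell, j:j + cell].ravel()
            a = ang[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0.0, 180.0), weights=m)
            feats.append(hist / (np.linalg.norm(hist) + 1e-6))
    return np.concatenate(feats)

class PiSigma:
    """One pi-sigma output unit: K linear 'sigma' units whose outputs are
    multiplied together ('pi'), then squashed by a sigmoid."""
    def __init__(self, n_in, k=2, rng=None):
        rng = np.random.default_rng(rng)
        self.W = rng.normal(scale=0.1, size=(k, n_in))
        self.b = np.zeros(k)

    def forward(self, x):
        sums = self.W @ x + self.b                     # K summing units
        return 1.0 / (1.0 + np.exp(-np.prod(sums)))    # product, then sigmoid

# Usage on a toy 16x16 "target": edges -> HOG -> pi-sigma score
img = np.zeros((16, 16))
img[4:12, 4:12] = 1.0
edges = morphological_gradient(img)
features = hog_features(edges)          # 4 cells x 9 bins = 36 features
score = PiSigma(features.size, k=2, rng=0).forward(features)
```

In a real pi-sigma network the weights of the summing units would be trained (e.g. by gradient descent on the sigmoid output); only the forward pass is shown here.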
Journal description:
International Journal of Image and Data Fusion provides a single source of information for all aspects of image and data fusion methodologies, developments, techniques and applications. Image and data fusion techniques are important for combining the many sources of satellite, airborne and ground-based imaging systems, and integrating these with other related data sets for enhanced information extraction and decision making. Image and data fusion aims at the integration of multi-sensor, multi-temporal, multi-resolution and multi-platform image data, together with geospatial data, GIS, in-situ, and other statistical data sets for improved information extraction, as well as to increase the reliability of the information. This leads to more accurate information that provides for robust operational performance, i.e. increased confidence, reduced ambiguity and improved classification enabling evidence-based management. The journal welcomes original research papers, review papers, shorter letters, technical articles, book reviews and conference reports in all areas of image and data fusion including, but not limited to, the following aspects and topics:
• Automatic registration/geometric aspects of fusing images with different spatial, spectral, temporal resolutions; phase information; or acquired in different modes
• Pixel, feature and decision level fusion algorithms and methodologies
• Data assimilation: fusing data with models
• Multi-source classification and information extraction
• Integration of satellite, airborne and terrestrial sensor systems
• Fusing temporal data sets for change detection studies (e.g. for Land Cover/Land Use Change studies)
• Image and data mining from multi-platform, multi-source, multi-scale, multi-temporal data sets (e.g. geometric information, topological information, statistical information, etc.)