{"title":"Radar Jamming Recognition Method Based on Cross-Modal Multilevel Feature Fusion","authors":"Mingyu Wu, Mingjun Huang, Hao Wu, Kai Xie","doi":"10.1049/rsn2.70111","DOIUrl":null,"url":null,"abstract":"<p>Effective radar jamming recognition is a critical precondition for enhancing radar antijamming capabilities. Although deep neural networks have been widely adopted for this task, existing methods mainly rely on time-frequency (TF) maps, overlooking inherent signal features such as amplitude and phase. This incomplete representation leads to a significant decline in recognition accuracy under low jamming-to-noise ratio (JNR) and complex interference conditions. To address these challenges, we propose a cross-modal multilevel feature fusion network (CM-FF), which innovatively integrates one-dimensional signal tensors, spectrum and two-dimensional TF images to compensate for information loss in single-modal approaches, significantly enhancing feature separability and identification accuracy. A multilevel feature extraction module is proposed to extract multiscale features from both one-dimensional (1D) tensors and two-dimensional (2D) images. Besides, a multimodal feature fusion module is proposed to assign weights to different features adaptively. Experimental results show that our proposed method achieves a recognition accuracy of 98.4%, representing a maximum improvement of 14.6% over existing methods. Even under extremely low JNR conditions of −10 dB, our network maintains an accuracy rate of 80.75%. 
Furthermore, the network has fewer than 1 million parameters, demonstrating its lightweight design and low resource requirements.</p>","PeriodicalId":50377,"journal":{"name":"Iet Radar Sonar and Navigation","volume":"20 1","pages":""},"PeriodicalIF":1.5000,"publicationDate":"2026-03-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/rsn2.70111","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Iet Radar Sonar and Navigation","FirstCategoryId":"94","ListUrlMain":"https://ietresearch.onlinelibrary.wiley.com/doi/10.1049/rsn2.70111","RegionNum":4,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0
Abstract
Effective radar jamming recognition is a critical precondition for enhancing radar anti-jamming capabilities. Although deep neural networks have been widely adopted for this task, existing methods rely mainly on time-frequency (TF) maps, overlooking inherent signal features such as amplitude and phase. This incomplete representation leads to a significant decline in recognition accuracy under low jamming-to-noise ratio (JNR) and complex interference conditions. To address these challenges, we propose a cross-modal multilevel feature fusion network (CM-FF) that integrates one-dimensional (1D) signal tensors, spectra, and two-dimensional (2D) TF images to compensate for the information loss of single-modal approaches, significantly enhancing feature separability and identification accuracy. A multilevel feature extraction module extracts multiscale features from both the 1D tensors and the 2D images, and a multimodal feature fusion module adaptively assigns weights to the different features. Experimental results show that the proposed method achieves a recognition accuracy of 98.4%, a maximum improvement of 14.6% over existing methods. Even under an extremely low JNR of −10 dB, the network maintains an accuracy of 80.75%. Furthermore, the network has fewer than 1 million parameters, demonstrating its lightweight design and low resource requirements.
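The abstract describes three input modalities (a 1D signal tensor, its spectrum, and a 2D TF map) combined by an adaptively weighted fusion module. The sketch below illustrates that idea only in outline, assuming a plain FFT/STFT front-end and energy-derived softmax weights as a stand-in for the paper's learned fusion scores; the function names and all implementation details here are hypothetical and do not reproduce the CM-FF architecture.

```python
import numpy as np

def make_modalities(signal, frame_len=64, hop=32):
    """Build three representations of a complex baseband signal."""
    # Modality 1: raw 1D tensor, real and imaginary parts as two channels.
    x1d = np.stack([signal.real, signal.imag])            # shape (2, N)
    # Modality 2: magnitude spectrum of the whole signal.
    spec = np.abs(np.fft.fft(signal))                     # shape (N,)
    # Modality 3: 2D TF map from a windowed STFT magnitude.
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        seg = signal[start:start + frame_len] * np.hanning(frame_len)
        frames.append(np.abs(np.fft.fft(seg)))
    tf_map = np.stack(frames).T                           # (freq, time)
    return x1d, spec, tf_map

def adaptive_fusion(features):
    """Fuse equal-length per-modality feature vectors.

    Softmax weights are derived from each vector's norm here, as a
    placeholder for learned attention scores.
    """
    feats = np.stack(features)                            # (M, D)
    scores = np.linalg.norm(feats, axis=1)                # one score per modality
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                              # softmax over modalities
    return weights @ feats                                # weighted sum, shape (D,)

# Usage: a 256-sample complex tone stands in for a received jamming signal.
sig = np.exp(1j * 2 * np.pi * 0.1 * np.arange(256))
x1d, spec, tf_map = make_modalities(sig)
fused = adaptive_fusion([np.ones(8), 2 * np.ones(8), 3 * np.ones(8)])
```

In a real network each modality would first pass through its own feature extractor before fusion; the point of the weighted sum is that low-information modalities (e.g. a TF map at very low JNR) can be down-weighted rather than averaged in blindly.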
Journal description:
IET Radar, Sonar & Navigation covers the theory and practice of systems and signals for radar, sonar, radiolocation, navigation, and surveillance purposes, in aerospace and terrestrial applications.
Examples include advances in waveform design, clutter and detection, electronic warfare, adaptive array and superresolution methods, tracking algorithms, synthetic aperture, and target recognition techniques.