Occluded Object Classification With mmWave MIMO Radar IQ Signals Using Dual-Stream Convolutional Neural Networks
Stefan Hägele; Fabian Seguel; Sabri Mustafa Kahya; Eckehard Steinbach
IEEE Transactions on Radar Systems, vol. 3, pp. 789-798
DOI: 10.1109/TRS.2025.3571284
Published: 2025-03-19
https://ieeexplore.ieee.org/document/11007063/
Abstract
The ability of millimeter-wave (mmWave) radar to penetrate lightweight materials and provide nonvisual insights into obscured areas represents a significant advantage over camera or LiDAR sensors. This capability enables mmWave radar to detect humans behind thin walls or identify occluded objects stored within luggage or packages. The latter capability is particularly valuable in industrial, logistics, and manufacturing applications, where the ability to “look inside the box without opening it” can greatly enhance efficiency and security. However, the current state of the art in these applications relies on expensive custom-built large antenna array imaging scanners, coupled with image-based object detection algorithms, to detect and classify occluded or concealed objects. To address this challenge more efficiently, we propose a lightweight classification approach for detecting various occluded objects inside a cardboard box. We employ a standard off-the-shelf mmWave 4-D frequency-modulated continuous wave (FMCW) imaging radar, combined with a deep-learning-based classification method in the form of a dual-stream convolutional neural network (CNN) that processes complex in-phase and quadrature (IQ) radar signals. In our experiments, this approach achieves an average overall accuracy of 95.15% over a collection of ten different concealed objects.
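For illustration only, the following is a minimal PyTorch sketch of how a dual-stream CNN might ingest complex IQ radar frames, with one convolutional stream for the in-phase component and one for the quadrature component, fused at the feature level before a classifier. The input size (1 x 64 x 64 per frame), the concatenation-based fusion, the layer widths, and the class count of ten are assumptions made for this example; the paper's actual architecture is not specified in the abstract and may differ.

# Minimal sketch of a dual-stream CNN for complex IQ radar data (PyTorch).
# Assumptions (not taken from the paper): one stream per IQ component,
# 1 x 64 x 64 input frames, feature concatenation as fusion, 10 classes.
import torch
import torch.nn as nn


class ConvStream(nn.Module):
    """One convolutional stream operating on a single real-valued channel."""

    def __init__(self, in_channels: int = 1):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> (B, 32, 1, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.features(x).flatten(1)  # (B, 32)


class DualStreamCNN(nn.Module):
    """Two parallel streams (I and Q) fused by concatenation before the classifier."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.i_stream = ConvStream()
        self.q_stream = ConvStream()
        self.classifier = nn.Linear(2 * 32, num_classes)

    def forward(self, iq: torch.Tensor) -> torch.Tensor:
        # iq: complex-valued tensor of shape (B, 1, H, W)
        i_feat = self.i_stream(iq.real.contiguous())
        q_feat = self.q_stream(iq.imag.contiguous())
        return self.classifier(torch.cat([i_feat, q_feat], dim=1))


if __name__ == "__main__":
    dummy = torch.randn(4, 1, 64, 64, dtype=torch.cfloat)  # placeholder IQ frames
    logits = DualStreamCNN()(dummy)
    print(logits.shape)  # torch.Size([4, 10])

Splitting the complex signal into real and imaginary parts lets each stream use standard real-valued convolutions while still exposing the phase information carried by the IQ representation; other fusion points or a magnitude/phase split are equally plausible design choices.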