Hang Xu;Yong Li;Qingran Dong;Li Liu;Jingxia Li;Jianguo Zhang;Bingjie Wang
Title: Random Code Radar With Range-Time–Frequency Points and Improved PointConv Network for Through-Wall Human Action Recognition
Journal: IEEE Sensors Journal, vol. 25, no. 8, pp. 13719-13728 (Q1, Engineering, Electrical & Electronic)
DOI: 10.1109/JSEN.2025.3548121
Published: 2025-03-11
URL: https://ieeexplore.ieee.org/document/10923655/
Citations: 0
Abstract
We propose and experimentally demonstrate a random code radar with range-time–frequency points and an improved PointConv network for through-wall human action recognition (HAR). A physical random code signal, which is naturally random and aperiodic, is used as the radar-transmitted waveform. A series of slow time-Doppler frequency (ST-DF) images is obtained by applying correlation ranging and the short-time Fourier transform (STFT) to the echo and reference signals; these images are then arranged at different ranges to form a 3-D range-time–frequency matrix. After constant false alarm rate (CFAR) detection, isosurface mesh generation (IMG), and farthest point sampling (FPS) are applied to the 3-D matrix, the resulting range-time–frequency points are fed into the improved PointConv network. The PointConv network is improved by model simplification and structural enhancement, achieving higher recognition accuracy with fewer parameters (5.53 M) and fewer floating-point operations (FLOPs, 1.06 G) than the existing PointConv network. Experimental results demonstrate that the proposed radar accurately recognizes human actions behind walls, with an average accuracy of 99.63% over ten actions and 96.83% over six participants. Compared with a 2-D image-based convolutional neural network, three-domain feature fusion, two 3-D point-based PointNet networks, and a 3-D point-based PointConv network, the proposed method achieves higher recognition accuracy.
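The processing chain in the abstract (correlation ranging, STFT over slow time, detection of strong cells in the 3-D range-time–frequency matrix, and farthest point sampling into a fixed-size point set) can be sketched in simplified form. This is a toy illustration, not the authors' implementation: the simulated signal model, window lengths, and the plain magnitude threshold standing in for CFAR detection are all assumptions, and the isosurface mesh generation step is omitted.

```python
import numpy as np

def short_time_fft(x, nperseg=16, step=8):
    """Magnitude STFT of a 1-D slow-time signal via sliding-window FFT.
    Returns an array of shape (n_freq, n_time)."""
    frames = [x[i:i + nperseg] for i in range(0, len(x) - nperseg + 1, step)]
    return np.abs(np.fft.rfft(np.asarray(frames), axis=1)).T

def farthest_point_sampling(points, n_samples):
    """Greedy farthest point sampling (FPS): start from point 0, then
    repeatedly add the point farthest from everything selected so far."""
    selected = [0]
    dists = np.linalg.norm(points - points[0], axis=1)
    for _ in range(n_samples - 1):
        idx = int(np.argmax(dists))
        selected.append(idx)
        dists = np.minimum(dists, np.linalg.norm(points - points[idx], axis=1))
    return points[selected]

def range_time_frequency_points(echo, reference, n_range_bins, threshold_db=-40.0):
    """Toy front end: correlation ranging per slow-time frame, STFT over
    slow time in each range bin, then a magnitude threshold (a crude
    stand-in for the paper's CFAR detection) to turn the 3-D matrix into
    (range, frequency, time) points.
    echo, reference: (n_frames, n_fast) arrays of fast-time samples."""
    n_frames = echo.shape[0]
    profiles = np.empty((n_frames, n_range_bins))
    for i in range(n_frames):
        # correlation ranging: one range profile per slow-time frame
        corr = np.correlate(echo[i], reference[i], mode="full")
        mid = len(corr) // 2
        profiles[i] = np.abs(corr[mid:mid + n_range_bins])
    # stack one slow time-Doppler frequency (ST-DF) image per range bin
    cube = np.stack([short_time_fft(profiles[:, r]) for r in range(n_range_bins)])
    mag_db = 20.0 * np.log10(cube / cube.max() + 1e-12)
    return np.argwhere(mag_db > threshold_db).astype(float)

# Example on synthetic data: a target echo delayed by 5 fast-time samples.
rng = np.random.default_rng(0)
ref = rng.standard_normal((128, 64))
echo = np.roll(ref, 5, axis=1) + 0.1 * rng.standard_normal((128, 64))
pts = range_time_frequency_points(echo, ref, n_range_bins=32)
sampled = farthest_point_sampling(pts, 64)  # fixed-size point set for the network
```

Sampling to a fixed number of points mirrors the role of FPS in the paper: point-based networks such as PointConv expect an input point set of constant size, while the number of above-threshold cells varies from action to action.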
Journal introduction:
The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. IEEE Sensors Journal deals with the following:
-Sensor Phenomenology, Modelling, and Evaluation
-Sensor Materials, Processing, and Fabrication
-Chemical and Gas Sensors
-Microfluidics and Biosensors
-Optical Sensors
-Physical Sensors: Temperature, Mechanical, Magnetic, and others
-Acoustic and Ultrasonic Sensors
-Sensor Packaging
-Sensor Networks
-Sensor Applications
-Sensor Systems: Signals, Processing, and Interfaces
-Actuators and Sensor Power Systems
-Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
-Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave (e.g., electromagnetic and acoustic) and non-wave (e.g., chemical, gravity, particle, thermal, radiative and non-radiative) sensor data; detection, estimation, and classification based on sensor data)
-Sensors in Industrial Practice