{"title":"Small-scale Data Underwater Acoustic Target Recognition with Deep Forest Model","authors":"Yafen Dong, Xiaohong Shen, Yongsheng Yan, Haiyan Wang","doi":"10.1109/ICSPCC55723.2022.9984335","DOIUrl":"https://doi.org/10.1109/ICSPCC55723.2022.9984335","url":null,"abstract":"Underwater acoustic target recognition is an issue of great interest, and its key lies in effective feature extraction. Thanks to the rapid development of underwater acoustic signal processing technology and machine learning, some progress has been made in this field. However, traditional machine learning methods utilize only shallow features, and their recognition ability needs further improvement. Although neural network-based deep learning methods can extract deep features, they are prone to over-fitting and other undesirable phenomena in underwater small-scale data scenarios. A method is therefore needed that can extract deep features while remaining suitable for small-scale data scenarios. In this research, an underwater acoustic target recognition method based on the deep forest model is proposed to meet these requirements. The method adopts MFCC features as the input feature vectors and the deep forest model as the classifier. 
Experimental results on the ShipsEar database show that the proposed method achieves satisfactory performance and shows promise for small-scale data underwater acoustic target recognition.","PeriodicalId":346917,"journal":{"name":"2022 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116434581","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"DAC-Sync: Research on underwater time synchronization algorithm based on Doppler effect and clustering model","authors":"Yunfeng Han, Ziyi Guo, Yujie Ouyang, Jucheng Zhang","doi":"10.1109/ICSPCC55723.2022.9984584","DOIUrl":"https://doi.org/10.1109/ICSPCC55723.2022.9984584","url":null,"abstract":"Time synchronization technology plays an important role in underwater acoustic sensor networks and is the foundation for network nodes to cooperate on distributed tasks. To address the problems of long underwater propagation delay, node mobility, and limited node energy, this paper proposes an underwater time synchronization algorithm (DAC-Sync) based on the Doppler effect and a clustering model. The clustering model is used to synchronize the nodes in stages. In addition, the effect of clock frequency skew is considered when estimating the Doppler scale factor: the skew is solved from multiple one-way interactions, and the clock phase offset is then obtained through a two-way interaction, completing the whole synchronization process. The performance of DAC-Sync was compared with CD-Sync under the same simulation conditions. The results show that, for punctuality accuracy, the algorithm drifts by an average of around 36 microseconds per second over a period of 10^6 seconds, while CD-Sync drifts by an average of around 146 microseconds per second; for timing accuracy, the two algorithms drift 1.5 milliseconds and 6.1 milliseconds, respectively. For energy consumption, compared with CD-Sync, DAC-Sync has a lower computing cost and a simpler synchronization process. 
Therefore, the algorithm achieves both higher energy efficiency and high synchronization accuracy.","PeriodicalId":346917,"journal":{"name":"2022 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC)","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126260595","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
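The skew-then-offset procedure described in the DAC-Sync abstract can be sketched numerically. This is a simplified model, not the paper's algorithm: the Doppler scale factor estimation is omitted, the two-way exchange assumes the skew has already been corrected, and all function names and timing values (40 ppm skew, 0.67 s delay, 2.5 ms offset) are illustrative.

```python
import numpy as np

# Clock model assumed here: t_local = skew * t_true + offset.

def estimate_skew(send_times, recv_times):
    """Least-squares slope of receive timestamps against send timestamps.
    Over a (roughly) constant-delay channel, repeated one-way messages make
    the slope equal the relative clock frequency skew."""
    slope, _ = np.polyfit(send_times, recv_times, 1)
    return slope

def estimate_offset(t1, t2, t3, t4):
    """Classic two-way exchange: A sends at t1 (A's clock), B receives at t2
    and replies at t3 (B's clock), A receives at t4 (A's clock). Assuming a
    symmetric propagation delay and skew already corrected, B's phase offset
    is ((t2 - t1) + (t3 - t4)) / 2."""
    return ((t2 - t1) + (t3 - t4)) / 2.0

# One-way phase: B's clock runs 40 ppm fast; ~0.67 s delay (~1 km under water).
true_skew, delay = 1.00004, 0.67
send = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
recv = true_skew * (send + delay)        # arrival timestamps on B's fast clock
skew_hat = estimate_skew(send, recv)

# Two-way phase (skew corrected): recover a 2.5 ms phase offset.
offset = 0.0025
t1 = 100.0
t2 = t1 + delay + offset                 # B receives
t3 = t2 + 0.01                           # B's turnaround time
t4 = t3 - offset + delay                 # A receives
offset_hat = estimate_offset(t1, t2, t3, t4)
```

With symmetric delays the delay terms cancel in the offset estimate, which is why the skew must be handled first: an uncorrected skew would leak into the two-way exchange as a time-dependent bias.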
{"title":"Dual attention unit-based generative adversarial networks for low-light image enhancement","authors":"Tian Ma, Chenhui Fu, Ming Guo, Jiayi Yang, Jia Liu","doi":"10.1109/ICSPCC55723.2022.9984400","DOIUrl":"https://doi.org/10.1109/ICSPCC55723.2022.9984400","url":null,"abstract":"Images taken in low-light conditions suffer from insufficient light intensity and high noise. Many existing methods do not work well in low-light environments; for example, the noise and artifacts in dark regions become more obvious after enhancement. Low-light image enhancement is therefore a challenging task in computer vision. To solve this problem, this paper proposes a lightweight generative adversarial network with dual-attention units to enhance underexposed photos. The generator contains only a simple two-layer convolution, with a dual-attention unit added between the two convolutions to suppress the noise generated during the enhancement process and the deviation in color restoration. The spatial attention module exploits non-local correlations in the image for denoising, while the channel attention module guides our low-light image enhancement network to optimize redundant color features. In addition, the ideas of PatchGAN and Relativistic GAN are combined in the discriminator, shifting its judgment from an absolute true-or-false probability to a relative one. 
The experimental results show that our method achieves better enhancement on low-illumination image datasets, with more natural color, better exposure, and less noise and fewer artifacts.","PeriodicalId":346917,"journal":{"name":"2022 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC)","volume":"125 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126700644","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
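The "relative true or false" idea this abstract borrows from Relativistic GAN can be illustrated with a small numpy sketch of the relativistic average discriminator loss. This is the generic RaGAN formulation, not the paper's exact loss (which also folds in PatchGAN's patch-wise scoring); `d_real` and `d_fake` stand for raw critic logits.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def relativistic_avg_d_loss(d_real, d_fake):
    """Instead of scoring each sample as absolutely real or fake, score how
    much MORE realistic a real sample looks than the average fake sample
    (and vice versa), then apply the usual binary cross-entropy."""
    real_rel = sigmoid(d_real - d_fake.mean())   # real vs. average fake
    fake_rel = sigmoid(d_fake - d_real.mean())   # fake vs. average real
    return -(np.log(real_rel + 1e-12).mean()
             + np.log(1.0 - fake_rel + 1e-12).mean())

# Well-separated logits give a small loss; indistinguishable ones do not.
loss_sep = relativistic_avg_d_loss(np.array([4.0, 5.0]), np.array([-4.0, -5.0]))
loss_mix = relativistic_avg_d_loss(np.array([0.0, 0.0]), np.array([0.0, 0.0]))
```

When real and fake logits coincide, both relativistic probabilities sit at 0.5 and the loss saturates at 2·ln 2, which is the "cannot tell them apart" baseline the generator pushes the discriminator toward.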
{"title":"Specific Emitter Identification Based on Homomorphic Filtering and Support Vector Machine-2K","authors":"Qi Wu, Zepeng Hu, Q. Wan","doi":"10.1109/ICSPCC55723.2022.9984423","DOIUrl":"https://doi.org/10.1109/ICSPCC55723.2022.9984423","url":null,"abstract":"Specific emitter identification (SEI) is the process of identifying or discriminating different emitters by extracting radio-frequency fingerprints from the received signals. A novel two-step SEI scheme is proposed in this paper. In the first step, new fingerprint features are extracted while the emitter-irrelevant information is suppressed by homomorphic filtering. Then, the two-view SVM-2K (Support Vector Machine on two Kernels) classifier is exploited to classify emitters effectively based on these features. Simulation results show that the proposed method achieves better classification performance than the benchmark method.","PeriodicalId":346917,"journal":{"name":"2022 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC)","volume":"2013 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128132088","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
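The homomorphic-filtering step can be sketched in one dimension: taking the log magnitude of the spectrum turns multiplicative spectral components into additive ones, so zeroing the low-quefrency cepstral bins removes the smooth (emitter-irrelevant) envelope while keeping the fine structure that can serve as a fingerprint. A minimal numpy sketch, assuming a real-valued signal; the 8-bin `cutoff` is illustrative, not a value from the paper.

```python
import numpy as np

def homomorphic_filter(x, cutoff=8):
    """Return the log-magnitude spectrum of x with its smooth envelope
    removed by cepstral high-pass liftering."""
    spec = np.fft.fft(x)
    log_mag = np.log(np.abs(spec) + 1e-12)   # multiplicative -> additive
    cep = np.fft.ifft(log_mag).real          # real cepstrum of the log spectrum
    cep[:cutoff] = 0.0                       # kill low quefrencies (envelope)
    cep[-(cutoff - 1):] = 0.0                # their symmetric counterparts
    return np.fft.fft(cep).real              # filtered log-magnitude spectrum

x = np.random.default_rng(0).normal(size=256)
feat = homomorphic_filter(x)                 # fingerprint-like feature vector
```

Zeroing the quefrency-0 bin also removes the mean of the log spectrum, so the resulting feature is automatically normalized against overall gain differences between receptions.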
{"title":"Mine pressure prediction model of fully mechanized mining face based on Improved Transformer","authors":"Yaping Liu, Lihong Dong, Ou Ye","doi":"10.1109/ICSPCC55723.2022.9984378","DOIUrl":"https://doi.org/10.1109/ICSPCC55723.2022.9984378","url":null,"abstract":"With the increase of mining depth, mine pressure disasters at the fully mechanized mining face occur more frequently and have a significant impact on the production safety of coal mines, so accurate prediction of mine pressure at the working face is of great significance for preventing coal mine disasters. In order to improve prediction accuracy, an improved Transformer mine pressure prediction model is proposed in this paper. Firstly, gray correlation analysis is used to analyze and rank the mine pressure monitoring data of multiple supports at the working face; secondly, the trend-seasonality decomposition method is combined with the Transformer to build the improved prediction model, which is then tuned with an optimization algorithm to predict mine pressure at the fully mechanized working face. The Root Mean Squared Error (RMSE) and Mean Absolute Error (MAE) are used to evaluate the prediction performance of the model. 
The experimental results show that the improved Transformer model outperforms the traditional BP neural network, GRU, LSTM and the basic Transformer model, achieving higher accuracy.","PeriodicalId":346917,"journal":{"name":"2022 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC)","volume":"77 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130579661","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
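The trend-seasonality decomposition this abstract pairs with the Transformer can be sketched with the classical additive decomposition. This is a simplified stand-in for the paper's module, not its actual implementation: the period is assumed known, and the moving-average trend estimate is the textbook version.

```python
import numpy as np

def decompose(series, period):
    """Classical additive decomposition: series = trend + seasonal + residual."""
    # Trend: moving average over one full period smooths the seasonality out.
    kernel = np.ones(period) / period
    trend = np.convolve(series, kernel, mode="same")
    detrended = series - trend
    # Seasonal: mean of the detrended values at each phase of the period.
    phase_means = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(phase_means, len(series) // period + 1)[: len(series)]
    residual = series - trend - seasonal
    return trend, seasonal, residual

# Toy pressure-like series: rising trend plus a period-12 cyclic component.
t = np.arange(120)
series = 0.1 * t + np.sin(2.0 * np.pi * t / 12.0)
trend, seasonal, residual = decompose(series, 12)
```

Feeding the trend and seasonal components to a forecaster separately (rather than the raw series) is what lets an attention model spend its capacity on the hard-to-predict residual dynamics.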
{"title":"Impulsive Noise Suppression and Concatenated Code for OFDM Underwater Acoustic Communications","authors":"Gang Tan, Shefeng Yan, Binbin Yang","doi":"10.1109/ICSPCC55723.2022.9984322","DOIUrl":"https://doi.org/10.1109/ICSPCC55723.2022.9984322","url":null,"abstract":"Underwater acoustic (UWA) information transmission is vulnerable to interference from impulsive noise, which has short duration, high energy and random occurrence, seriously decreasing the reliability of orthogonal frequency-division multiplexing (OFDM) systems. In this paper, we first introduce the symmetric alpha stable (SαS) distribution to model underwater impulsive noise and verify its accuracy with measured noise data. For strong impulsive interference, an adaptive window median filter algorithm based on the Chebyshev inequality (AWMF-C) is proposed, which applies the Chebyshev inequality twice to detect the impulses and designs an adaptive window median filter to suppress them. Secondly, a serial concatenated code with an RS code as the outer code, together with an interleaver, is applied to further suppress the residual impulsive interference. 
Simulation results show that the proposed algorithm suppresses impulsive noise effectively, and that the concatenated coding system given in this paper further reduces the bit error rate (BER) of the communication system under an impulsive noise background.","PeriodicalId":346917,"journal":{"name":"2022 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC)","volume":"11 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132127523","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
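The core of the Chebyshev-plus-median-filter idea can be sketched in numpy. This is a simplified single-pass version, not the paper's AWMF-C (which applies the Chebyshev test twice and adapts the window length); `k` and `win` are illustrative parameters.

```python
import numpy as np

def suppress_impulses(x, k=4.0, win=5):
    """Flag samples deviating from the mean by more than k standard
    deviations and replace each with a local median. Chebyshev's inequality
    guarantees P(|X - mu| >= k*sigma) <= 1/k^2 for ANY distribution, so the
    threshold needs no Gaussian assumption -- useful for SaS-like noise."""
    mu, sigma = x.mean(), x.std()
    out = x.copy()
    half = win // 2
    for i in np.flatnonzero(np.abs(x - mu) >= k * sigma):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        out[i] = np.median(x[lo:hi])     # median ignores the impulse itself
    return out

# Toy baseband noise with three strong impulses injected.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 0.1, 1000)
x[[100, 400, 700]] = 50.0
cleaned = suppress_impulses(x)
```

The median replacement works because, within a short window, an isolated impulse is an order statistic outlier: the window median is drawn from the surrounding well-behaved samples.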
{"title":"Research on the Existence of Non-cooperative Underwater Acoustic Monitoring Network","authors":"Lin Sun, Xiaohong Shen, Zhengguo Liu, Hong Wang, Yifan Yuan, Haodi Mei","doi":"10.1109/ICSPCC55723.2022.9984334","DOIUrl":"https://doi.org/10.1109/ICSPCC55723.2022.9984334","url":null,"abstract":"The existence of Non-cooperative Underwater Acoustic Monitoring Networks (UAMNs) is a key research problem in the field of network countermeasures. Since it is difficult for our sensing nodes to directly obtain the networking situation of wireless sensor nodes in non-cooperative regions, judging whether those nodes form a network from their behavioral characteristics is a worthwhile research direction. In this paper, the connectivity rate between two nodes is obtained through the information interaction between the nodes and their shared link. Based on this, graph theory is used to determine the connectivity and relevance of the nodes in the region, so that the probability that an Underwater Acoustic Monitoring Network exists in a non-cooperative sensing region can be determined. Finally, a simulation experiment is carried out to verify the accuracy of the theory.","PeriodicalId":346917,"journal":{"name":"2022 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC)","volume":"75 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131541147","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
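The graph-theoretic connectivity test at the heart of this abstract can be sketched as follows: threshold the pairwise connectivity-rate matrix into an adjacency matrix and check reachability. This is a minimal deterministic sketch, not the paper's probabilistic existence estimate; the 0.5 threshold and the function name are illustrative assumptions.

```python
import numpy as np
from collections import deque

def network_exists(rates, threshold=0.5):
    """Breadth-first search from node 0 over the thresholded adjacency
    matrix: if every node is reachable, the observed non-cooperative nodes
    plausibly form one monitoring network."""
    adj = np.asarray(rates) >= threshold
    n = adj.shape[0]
    seen, queue = {0}, deque([0])
    while queue:
        u = queue.popleft()
        for v in range(n):
            if v != u and adj[u, v] and v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) == n

# Chain 0-1-2 with strong links: a single connected network.
chain = np.array([[1.0, 0.9, 0.1],
                  [0.9, 1.0, 0.8],
                  [0.1, 0.8, 1.0]])
# Node 2 weakly linked to both others: no single network.
split = np.array([[1.0, 0.9, 0.1],
                  [0.9, 1.0, 0.2],
                  [0.1, 0.2, 1.0]])
```

Turning the hard threshold into a probability over random link realizations is what would recover the paper's "probability of existence" rather than a yes/no answer.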
{"title":"Ship Recognition Algorithm Based on ResNet in SAR Images","authors":"Jinfeng He, Hongtu Xie, Xinqiao Jiang, Zhitao Wu, Guoqian Wang","doi":"10.1109/ICSPCC55723.2022.9984594","DOIUrl":"https://doi.org/10.1109/ICSPCC55723.2022.9984594","url":null,"abstract":"In the application field of ocean target recognition in remote sensing images, the classification of marine ships in synthetic aperture radar (SAR) images remains a significant challenge. Traditional ship target recognition algorithms rely on manually selected features, which must be designed on the basis of extensive experiments and professional domain knowledge, leading to poor robustness and poor recognition results. In this paper, in order to solve the problem of ship recognition in SAR images without manually selected features, a method based on ResNet is proposed. First, a data augmentation module is used to expand the experimental dataset. Then, ResNet is used to recognize ships in the SAR images. Finally, experiments on the ship SAR dataset are carried out, and the proposed recognition method is verified to be highly effective and applicable.","PeriodicalId":346917,"journal":{"name":"2022 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC)","volume":"82 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133885337","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Active Jamming Signal Recognition based on Residual Neural Network*","authors":"Mingqiu Ren, B. Cheng, Po Gao","doi":"10.1109/ICSPCC55723.2022.9984424","DOIUrl":"https://doi.org/10.1109/ICSPCC55723.2022.9984424","url":null,"abstract":"Effectively identifying the type of an active jamming signal has important practical significance for the accurate perception of radar anti-jamming systems. Therefore, a radar active jamming identification method based on the fractional Fourier transform and a residual neural network is proposed. Before jamming-signal pattern recognition, the time-frequency structure model of the signal is established, and influencing factors such as the radar technical system and the jamming-signal processing cycle are comprehensively considered. The constraints imposed by the time-frequency analysis kernel function and processing on the jamming signal type, the cross terms of composite modulation signals, and the effectiveness of distorted signal characteristics are analyzed. Then, according to the availability and recognition-rate requirements of subsequent signal classifiers, a recognition model based on a residual neural network (ResNet) is used to solve the problem. 
The simulation results show that the recognition rate for multiple active jamming patterns under different jamming-to-signal ratios exceeds 90%, which verifies the effectiveness and rationality of the method.","PeriodicalId":346917,"journal":{"name":"2022 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC)","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115077988","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Multi-Temporal PolSAR Image Classification Based on Polarimetric Scattering Tensor Eigenvalue Decomposition and Deep CNN Model","authors":"Jun-Wu Deng, Haoliang Li, X. Cui, Siwei Chen","doi":"10.1109/ICSPCC55723.2022.9984546","DOIUrl":"https://doi.org/10.1109/ICSPCC55723.2022.9984546","url":null,"abstract":"Multi-temporal polarimetric synthetic aperture radar (PolSAR) imagery is an important tool for monitoring crop growth and evaluating disaster damage. Multi-temporal PolSAR data has a high-dimensional representation. Benefiting from tensor analysis, a three-dimensional polarimetric scattering tensor is established, and a polarimetric scattering tensor eigenvalue decomposition is proposed to derive the polarimetric features, namely polarimetric tensor entropy, polarimetric tensor alpha angle, and polarimetric tensor anisotropy. Multi-temporal PolSAR image classification is applied to validate the effectiveness of the proposed features. To further improve classification accuracy, a 1 × 1 convolutional kernel is introduced to learn inter-temporal information. On the multi-temporal UAVSAR datasets, the proposed method achieves excellent classification accuracy.","PeriodicalId":346917,"journal":{"name":"2022 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC)","volume":"105 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-10-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114032551","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
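The eigenvalue-based parameters this abstract generalizes come from the classical single-image Cloude-Pottier decomposition, which can be sketched in numpy. This sketch covers only entropy and anisotropy (the alpha angle also needs the eigenvectors), operates on an ordinary 3×3 Hermitian coherency matrix rather than the paper's multi-temporal tensor, and the function name is illustrative.

```python
import numpy as np

def entropy_anisotropy(T):
    """Entropy H = -sum p_i log3 p_i and anisotropy A = (l2-l3)/(l2+l3)
    from the eigenvalues of a 3x3 Hermitian coherency matrix, sorted in
    descending order. H ~ 0 means one dominant scattering mechanism;
    H ~ 1 means fully random scattering."""
    lam = np.linalg.eigvalsh(T)[::-1]        # real eigenvalues, descending
    lam = np.clip(lam, 1e-12, None)          # guard the log against zeros
    p = lam / lam.sum()                      # pseudo-probabilities
    H = -np.sum(p * np.log(p)) / np.log(3.0)
    A = (lam[1] - lam[2]) / (lam[1] + lam[2])
    return H, A

H_iso, A_iso = entropy_anisotropy(np.eye(3) / 3.0)          # random scattering
H_det, A_det = entropy_anisotropy(np.diag([1.0, 0.0, 0.0]))  # single mechanism
```

The paper's tensor eigenvalue decomposition extends exactly this recipe: the eigenvalue spectrum of the stacked multi-temporal tensor replaces that of the single-date coherency matrix.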