{"title":"Dynamic Convolution and Transformer Based Dual-Branch Coding in Semantic Communication System","authors":"Jiaxin Yang;Zhiquan Bai;Xiaodong Xu;Xuchao Teng;Mengying Sun;KyungSup Kwak","doi":"10.1109/LCOMM.2025.3557421","DOIUrl":null,"url":null,"abstract":"Semantic communication is a potential key technology in 6G intelligent communication era. To reduce the information redundancy and improve the accuracy of text data transmission, this letter proposes a novel dual-branch joint source-channel coding model in semantic communication system that integrates dynamic convolution (DynConv) with Transformer architecture. The system utilizes the convolutional attention mechanism to capture the local semantic details and the Transformer self-attention mechanism for contextual associations within sentences. In particular, we further investigate the dimension allocation strategy between these two attention mechanisms, seeking the optimal balance between the recovery performance and complexity of the system. Simulation results demonstrate that, compared with the traditional communication system and the typical semantic communication system based on the standard Transformer, the proposed system achieves higher recovery performance, better robustness, and lower complexity.","PeriodicalId":13197,"journal":{"name":"IEEE Communications Letters","volume":"29 5","pages":"1161-1165"},"PeriodicalIF":3.7000,"publicationDate":"2025-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Communications Letters","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10948396/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"TELECOMMUNICATIONS","Score":null,"Total":0}
Citations: 0
Abstract
Semantic communication is a potential key technology for the 6G intelligent communication era. To reduce information redundancy and improve the accuracy of text data transmission, this letter proposes a novel dual-branch joint source-channel coding model for semantic communication systems that integrates dynamic convolution (DynConv) with the Transformer architecture. The system utilizes a convolutional attention mechanism to capture local semantic details and the Transformer self-attention mechanism to model contextual associations within sentences. In particular, we further investigate the dimension allocation strategy between these two attention mechanisms, seeking the optimal balance between the recovery performance and the complexity of the system. Simulation results demonstrate that, compared with the traditional communication system and the typical semantic communication system based on the standard Transformer, the proposed system achieves higher recovery performance, better robustness, and lower complexity.
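To make the dual-branch idea concrete, below is a minimal PyTorch sketch of one encoder layer in the spirit of the abstract: a dynamic-convolution branch captures local semantic details, a self-attention branch models sentence-level context, and the channel split between the two branches (d_conv vs. the remaining dimensions) stands in for the dimension allocation strategy. All module names, layer sizes, and the concatenation-based fusion are illustrative assumptions, not the authors' exact design.

```python
# Hedged sketch of a dual-branch (DynConv + self-attention) encoder layer.
# The DynConv branch predicts a softmax-normalized local kernel per position;
# the attention branch is standard multi-head self-attention. The d_conv /
# d_attn split illustrates the dimension allocation between the two branches.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynConv(nn.Module):
    """Dynamic convolution: a local kernel is predicted from each position."""

    def __init__(self, dim, kernel_size=3):
        super().__init__()
        self.kernel_size = kernel_size
        self.kernel_proj = nn.Linear(dim, kernel_size)  # kernel shared across channels

    def forward(self, x):                                   # x: (batch, seq_len, dim)
        k = self.kernel_size
        weights = F.softmax(self.kernel_proj(x), dim=-1)    # (batch, seq_len, k)
        pad = k // 2
        x_pad = F.pad(x.transpose(1, 2), (pad, pad))        # (batch, dim, seq_len + 2*pad)
        windows = x_pad.unfold(2, k, 1)                     # (batch, dim, seq_len, k)
        return torch.einsum('bdtk,btk->btd', windows, weights)  # weighted local sum


class DualBranchLayer(nn.Module):
    """Splits channels between a DynConv branch and a self-attention branch."""

    def __init__(self, d_model=128, d_conv=48, n_heads=4, kernel_size=3):
        super().__init__()
        self.d_conv = d_conv                     # dimensions allocated to DynConv
        d_attn = d_model - d_conv                # dimensions allocated to self-attention
        self.conv_branch = DynConv(d_conv, kernel_size)
        self.attn_branch = nn.MultiheadAttention(d_attn, n_heads, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.ReLU(),
                                 nn.Linear(4 * d_model, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):                        # x: (batch, seq_len, d_model)
        xc, xa = x[..., :self.d_conv], x[..., self.d_conv:]
        local = self.conv_branch(xc)                               # local semantic details
        ctx, _ = self.attn_branch(xa, xa, xa, need_weights=False)  # sentence-level context
        x = self.norm1(x + torch.cat([local, ctx], dim=-1))        # fuse the two branches
        return self.norm2(x + self.ffn(x))


if __name__ == "__main__":
    tokens = torch.randn(2, 16, 128)             # toy batch of embedded sentences
    print(DualBranchLayer()(tokens).shape)       # torch.Size([2, 16, 128])
```

Under this framing, trading d_conv against d_attn at a fixed d_model is one simple way to explore the recovery-performance versus complexity balance the letter studies; the actual allocation rule and fusion method should be taken from the paper itself.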
About the Journal
The IEEE Communications Letters publishes short papers in a rapid publication cycle on advances in the state-of-the-art of communication over different media and channels including wire, underground, waveguide, optical fiber, and storage channels. Both theoretical contributions (including new techniques, concepts, and analyses) and practical contributions (including system experiments and prototypes, and new applications) are encouraged. This journal focuses on the physical layer and the link layer of communication systems.