Design of a Modified Transformer Architecture Based on Relative Position Coding

IF 2.9 · CAS Zone 4 · Computer Science
Wenfeng Zheng, Gu Gong, Jiawei Tian, Siyu Lu, Ruiyang Wang, Zhengtong Yin, Xiaolu Li, Lirong Yin
{"title":"Design of a Modified Transformer Architecture Based on Relative Position Coding","authors":"Wenfeng Zheng, Gu Gong, Jiawei Tian, Siyu Lu, Ruiyang Wang, Zhengtong Yin, Xiaolu Li, Lirong Yin","doi":"10.1007/s44196-023-00345-z","DOIUrl":null,"url":null,"abstract":"Abstract Natural language processing (NLP) based on deep learning provides a positive performance for generative dialogue system, and the transformer model is a new boost in NLP after the advent of word vectors. In this paper, a Chinese generative dialogue system based on transformer is designed, which only uses a multi-layer transformer decoder to build the system and uses the design of an incomplete mask to realize one-way language generation. That is, questions can perceive context information in both directions, while reply sentences can only output one-way autoregressive. The above system improvements make the one-way generation of dialogue tasks more logical and reasonable, and the performance is better than the traditional dialogue system scheme. In consideration of the long-distance information weakness of absolute position coding, we put forward the improvement of relative position coding in theory, and verify it in subsequent experiments. In the transformer module, the calculation formula of self-attention is modified, and the relative position information is added to replace the absolute position coding of the position embedding layer. The performance of the modified model in BLEU, embedding average, grammatical and semantic coherence is ideal, to enhance long-distance attention.","PeriodicalId":54967,"journal":{"name":"International Journal of Computational Intelligence Systems","volume":"47 2","pages":"0"},"PeriodicalIF":2.9000,"publicationDate":"2023-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Computational Intelligence Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s44196-023-00345-z","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Natural language processing (NLP) based on deep learning has delivered strong performance for generative dialogue systems, and the transformer model has given NLP a further boost since the advent of word vectors. In this paper, a Chinese generative dialogue system based on the transformer is designed. The system is built solely from a multi-layer transformer decoder and uses an incomplete mask design to realize unidirectional language generation: question tokens can perceive context information in both directions, while reply sentences are generated autoregressively in one direction only. These design choices make the one-way generation of dialogue responses more logical and coherent, and the system outperforms the traditional dialogue system scheme. To address the weakness of absolute position coding in capturing long-distance information, we propose a relative position coding improvement in theory and verify it in subsequent experiments. In the transformer module, the self-attention calculation is modified so that relative position information is added in place of the absolute position coding of the position embedding layer. The modified model performs well on BLEU, embedding average, and grammatical and semantic coherence, confirming that it enhances long-distance attention.
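The abstract names two mechanisms: an incomplete attention mask (questions encoded bidirectionally, replies generated autoregressively) and self-attention with relative position information replacing the absolute position embedding. The sketch below is a minimal PyTorch illustration of both ideas, not the authors' implementation: the mask construction, the Shaw-style clipped relative-position bias, and the names incomplete_mask and relative_attention are assumptions made here for illustration.

```python
import torch
import torch.nn.functional as F


def incomplete_mask(question_len: int, reply_len: int) -> torch.Tensor:
    """Boolean mask of shape (L, L); True means attention is allowed."""
    L = question_len + reply_len
    mask = torch.zeros(L, L, dtype=torch.bool)
    # Every token (question or reply) may attend to the full question.
    mask[:, :question_len] = True
    # Reply tokens may additionally attend to earlier reply tokens (causal part).
    mask[question_len:, question_len:] = torch.tril(
        torch.ones(reply_len, reply_len, dtype=torch.bool)
    )
    # Question rows never see reply columns (left False), so the question is
    # encoded bidirectionally while the reply is generated one way only.
    return mask


def relative_attention(q, k, v, mask, rel_bias, max_dist: int = 64):
    """Single-head self-attention with an additive relative-position bias.

    q, k, v: (L, d) tensors; rel_bias: learnable vector of length 2*max_dist + 1.
    """
    L, d = q.shape
    scores = q @ k.transpose(-2, -1) / d ** 0.5        # content term, (L, L)
    # Clipped relative offsets j - i, shifted into the index range [0, 2*max_dist].
    pos = torch.arange(L)
    rel = (pos[None, :] - pos[:, None]).clamp(-max_dist, max_dist) + max_dist
    scores = scores + rel_bias[rel]                     # relative-position term
    scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v


# Toy usage: a 4-token question followed by a 3-token reply.
L, d = 7, 16
x = torch.randn(L, d)
bias = torch.zeros(2 * 64 + 1, requires_grad=True)      # learned during training
out = relative_attention(x, x, x, incomplete_mask(4, 3), bias)
print(out.shape)  # torch.Size([7, 16])
```

In this reading, the position term is added directly to the attention scores, so no absolute position embedding is needed at the input layer; a learned per-offset bias of this kind is one common way to inject relative position into self-attention.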
Source journal
International Journal of Computational Intelligence Systems
Category: Engineering & Technology - Computer Science: Interdisciplinary Applications
Self-citation rate: 3.40%
Articles published per year: 94
Journal description: The International Journal of Computational Intelligence Systems publishes original research on all aspects of applied computational intelligence, especially papers demonstrating the use of techniques and methods originating from computational intelligence theory. The core theories of computational intelligence are fuzzy logic, neural networks, evolutionary computation and probabilistic reasoning. The journal publishes only articles related to the use of computational intelligence and broadly covers the following topics: autonomous reasoning; bio-informatics; cloud computing; condition monitoring; data science; data mining; data visualization; decision support systems; fault diagnosis; intelligent information retrieval; human-machine interaction and interfaces; image processing; internet and networks; noise analysis; pattern recognition; prediction systems; power (nuclear) safety systems; process and system control; real-time systems; risk analysis and safety-related issues; robotics; signal and image processing; IoT and smart environments; systems integration; system control; system modelling and optimization; telecommunications; time series prediction; warning systems; virtual reality; web intelligence; deep learning.