Exploring English Long Sentence Translation Methods by Applying Natural Language Processing Techniques

IF 3.1 Q1 Mathematics
Fengmei Shang, You Li
DOI: 10.2478/amns.2023.2.01352
Journal: Applied Mathematics and Nonlinear Sciences, vol. 11, no. 9
Published: 2023-12-05 (Journal Article)
Citations: 0

Abstract

Abstract This paper analyzes the “encoder-decoder” framework in neural machine translation and clarifies that the underlying natural language processing task is sequence learning. Recurrent neural networks, which specialize in processing sequence data, are then used to combine historical hidden-layer output with the current input, yielding good translation results. Applying the attention mechanism to natural language processing, a Transformer model based on full attention is constructed so that the source language is translated while alignment operations are performed against the target language. Evaluation of the full-attention Transformer model shows that, with the f feature included in both models, its Pearson correlation coefficient is 0.0152 higher than that of the Bilingual Expert model, a relative improvement of 2.92%. This further demonstrates the Transformer model’s ability to translate English sentences correctly and effectively. It also shows that applying natural language processing technology can improve the efficiency of English long-sentence translation and comprehensively raise the quality of long-sentence translation.
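The recurrent step the abstract describes (combining the historical hidden-layer output with the current input) can be sketched as a vanilla RNN cell. This is a minimal illustration, not the paper's implementation; the dimensions and random weights are purely illustrative.

```python
import numpy as np

def rnn_step(h_prev, x_t, W_h, W_x, b):
    """One vanilla RNN step: mix the previous hidden state with the current input."""
    return np.tanh(W_h @ h_prev + W_x @ x_t + b)

rng = np.random.default_rng(0)
hidden_dim, input_dim = 4, 3
W_h = rng.normal(size=(hidden_dim, hidden_dim))  # recurrent weights (illustrative)
W_x = rng.normal(size=(hidden_dim, input_dim))   # input weights (illustrative)
b = np.zeros(hidden_dim)

h = np.zeros(hidden_dim)
for x_t in rng.normal(size=(5, input_dim)):  # a toy sequence of 5 tokens
    h = rnn_step(h, x_t, W_h, W_x, b)
print(h.shape)  # (4,)
```

Because each step feeds the previous hidden state back in, `h` after the loop depends on the entire sequence, which is what makes the architecture suited to sequence learning.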
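The “full attention” at the core of the Transformer is scaled dot-product attention; its weight matrix is a soft alignment between target and source positions, which is how translation and alignment happen in one operation. A minimal sketch under illustrative dimensions (not the paper's configuration):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # numerically stable softmax over the source positions
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # each row is a soft alignment
    return weights @ V, weights

rng = np.random.default_rng(1)
Q = rng.normal(size=(2, 8))  # 2 target positions, d_k = 8
K = rng.normal(size=(5, 8))  # 5 source positions
V = rng.normal(size=(5, 8))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 8): one context vector per target position
```

Each row of `w` sums to 1 and says how much each source token contributes to a given target position — the alignment operation the abstract refers to.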
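The two reported gains are mutually consistent: an absolute Pearson-correlation gain of 0.0152 that equals a 2.92% relative gain implies a Bilingual Expert baseline of roughly 0.52. A quick arithmetic check (the baseline value is inferred here, not stated in the abstract):

```python
delta_abs = 0.0152  # absolute gain in Pearson correlation (from the abstract)
delta_rel = 0.0292  # the same gain expressed relatively (2.92%)

baseline = delta_abs / delta_rel    # implied Bilingual Expert coefficient
transformer = baseline + delta_abs  # implied Transformer coefficient
print(round(baseline, 3), round(transformer, 3))  # 0.521 0.536
```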
Source journal: Applied Mathematics and Nonlinear Sciences — Engineering (miscellaneous)
CiteScore: 2.90
Self-citation rate: 25.80%
Articles published: 203