Arabic and English Relative Clauses and Machine Translation Challenges

Khalil A. Nagi
{"title":"Arabic and English Relative Clauses and Machine Translation Challenges","authors":"Khalil A. Nagi","doi":"10.20428/jss.v29i3.2180","DOIUrl":null,"url":null,"abstract":"The study aims at performing an error analysis as well as providing an evaluation of the quality of neural machine translation (NMT) represented by Google Translate when translating relative clauses. The study uses two test suites are composed of sentences that contain relative clauses. The first test suite composes of 108 pair sentences that are translated from English to Arabic whereas the second composes of 72 Arabic sentences that are translated into English. Errors annotation is performed by 6 professional annotators. The study presents a list of the annotated errors divided into accuracy and fluency errors that occur based on MQM. Manual evaluation is also performed by the six professionals along with a BLEU automatic evaluation using the Tilde Me platform. The results show that fluency errors are more frequent than accuracy errors. They also show that the frequency of errors and MT quality when translating from English into Arabic is lower than the frequency of errors and MT quality when translating from Arabic into English is also presented. 
Based on the performed error analysis and both manual and automatic evaluation, it is pointed out that the gap between MT and professional human translation is still large.","PeriodicalId":53082,"journal":{"name":"mjl@ ldrst ljtm`y@","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-09-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"mjl@ ldrst ljtm`y@","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.20428/jss.v29i3.2180","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

The study aims at performing an error analysis and an evaluation of the quality of neural machine translation (NMT), represented by Google Translate, when translating relative clauses. The study uses two test suites composed of sentences that contain relative clauses. The first test suite consists of 108 sentence pairs translated from English into Arabic, whereas the second consists of 72 Arabic sentences translated into English. Error annotation is performed by six professional annotators. The study presents a list of the annotated errors, divided into accuracy and fluency errors, based on the Multidimensional Quality Metrics (MQM) framework. Manual evaluation is also performed by the six professionals, along with automatic BLEU evaluation using the Tilde Me platform. The results show that fluency errors are more frequent than accuracy errors. They also show that the frequency of errors and the MT quality are lower when translating from English into Arabic than when translating from Arabic into English. Based on the performed error analysis and both manual and automatic evaluation, it is pointed out that the gap between MT and professional human translation is still large.
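The BLEU evaluation mentioned in the abstract can be illustrated with a minimal, self-contained sketch: a smoothed sentence-level BLEU with up-to-4-gram precision and a brevity penalty. This is an illustrative implementation for readers unfamiliar with the metric, not the scoring code used by the Tilde Me platform.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU: geometric mean of smoothed n-gram
    precisions (n = 1..max_n) times a brevity penalty."""
    cand, ref = candidate.split(), reference.split()
    log_precisions = []
    for n in range(1, max_n + 1):
        cand_counts = ngrams(cand, n)
        ref_counts = ngrams(ref, n)
        # clipped overlap: each n-gram counts at most as often as in the reference
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        # add-one smoothing so one empty n-gram order does not zero the score
        log_precisions.append(math.log((overlap + 1) / (total + 1)))
    # brevity penalty punishes candidates shorter than the reference
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(sum(log_precisions) / max_n)

print(round(bleu("the cat sat", "a cat sat on the mat"), 3))  # → 0.235
```

A corpus-level BLEU, as reported in the study, aggregates clipped n-gram counts over all sentence pairs before taking the geometric mean, rather than averaging per-sentence scores.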