A Survey of Multilingual Neural Machine Translation Based on Sparse Models

IF 3.5 · CAS Tier 1 (Computer Science) · JCR Q1 (Multidisciplinary)
Shaolin Zhu;Dong Jian;Deyi Xiong
DOI: 10.26599/TST.2023.9010097
Journal: Tsinghua Science and Technology, vol. 30, no. 6, pp. 2399-2418
Published: 2025-07-04 (Journal Article)
Full text: https://ieeexplore.ieee.org/document/11072061/
Citation count: 0

Abstract

Recent research has shown a burgeoning interest in exploring sparse models for massively Multilingual Neural Machine Translation (MNMT). In this paper, we present a comprehensive survey of this emerging topic. Massively MNMT, when based on sparse models, offers significant improvements in parameter efficiency and reduces interference compared to its dense model counterparts. Various methods have been proposed to leverage sparse models for enhancing translation quality. However, the lack of a thorough survey has hindered the identification and further investigation of the most promising approaches. To address this gap, we provide an exhaustive examination of the current research landscape in massively MNMT, with a special emphasis on sparse models. Initially, we categorize the various sparse model-based approaches into distinct classifications. We then delve into each category in detail, elucidating their fundamental modeling principles, core issues, and the challenges they face. Wherever possible, we conduct comparative analyses to assess the strengths and weaknesses of different methodologies. Moreover, we explore potential future research avenues for MNMT based on sparse models. This survey serves as a valuable resource for both newcomers and established experts in the field of MNMT, particularly those interested in sparse model applications.
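The sparse models the survey focuses on are typically Mixture-of-Experts (MoE) layers, where each token activates only a small subset of the parameters. As a rough illustration of that idea (not code from the survey; all names, sizes, and the top-1 routing choice are generic assumptions), a minimal NumPy sketch of a sparsely gated feed-forward layer might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, d_ff, n_experts = 8, 16, 4

# One weight pair per expert; a dense model would apply all of them to every token.
W_in = rng.normal(size=(n_experts, d_model, d_ff)) * 0.1
W_out = rng.normal(size=(n_experts, d_ff, d_model)) * 0.1
W_gate = rng.normal(size=(d_model, n_experts)) * 0.1

def moe_ffn(x):
    """Top-1 sparse MoE feed-forward: each token activates exactly one expert."""
    logits = x @ W_gate                       # (n_tokens, n_experts)
    probs = np.exp(logits - logits.max(-1, keepdims=True))
    probs /= probs.sum(-1, keepdims=True)     # softmax over experts
    expert = probs.argmax(-1)                 # top-1 routing decision per token
    y = np.empty_like(x)
    for e in range(n_experts):
        idx = np.where(expert == e)[0]
        if idx.size == 0:
            continue                          # this expert received no tokens
        h = np.maximum(x[idx] @ W_in[e], 0)   # ReLU expert FFN
        # Scale by the gate probability, as in standard MoE formulations.
        y[idx] = (h @ W_out[e]) * probs[idx, e:e + 1]
    return y, expert

tokens = rng.normal(size=(6, d_model))
out, chosen = moe_ffn(tokens)
```

Here total parameter count grows with `n_experts`, but per-token compute stays roughly constant, which is the parameter-efficiency property the abstract refers to; routing tokens of different languages to different experts is one way sparse models reduce cross-lingual interference.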
Source journal: Tsinghua Science and Technology
Categories: Computer Science, Information Systems; Computer Science, Software Engineering
CiteScore: 10.20
Self-citation rate: 10.60%
Annual article count: 2340
Journal introduction: Tsinghua Science and Technology (Tsinghua Sci Technol) started publication in 1996. It is an international academic journal sponsored by Tsinghua University and is published bimonthly. This journal aims at presenting up-to-date scientific achievements in computer science, electronic engineering, and other IT fields. Contributions from all over the world are welcome.