{"title":"A Survey of Multilingual Neural Machine Translation Based on Sparse Models","authors":"Shaolin Zhu;Dong Jian;Deyi Xiong","doi":"10.26599/TST.2023.9010097","DOIUrl":null,"url":null,"abstract":"Recent research has shown a burgeoning interest in exploring sparse models for massively Multilingual Neural Machine Translation (MNMT). In this paper, we present a comprehensive survey of this emerging topic. Massively MNMT, when based on sparse models, offers significant improvements in parameter efficiency and reduces interference compared to its dense model counterparts. Various methods have been proposed to leverage sparse models for enhancing translation quality. However, the lack of a thorough survey has hindered the identification and further investigation of the most promising approaches. To address this gap, we provide an exhaustive examination of the current research landscape in massively MNMT, with a special emphasis on sparse models. Initially, we categorize the various sparse model-based approaches into distinct classifications. We then delve into each category in detail, elucidating their fundamental modeling principles, core issues, and the challenges they face. Wherever possible, we conduct comparative analyses to assess the strengths and weaknesses of different methodologies. Moreover, we explore potential future research avenues for MNMT based on sparse models. This survey serves as a valuable resource for both newcomers and established experts in the field of MNMT, particularly those interested in sparse model applications.","PeriodicalId":48690,"journal":{"name":"Tsinghua Science and Technology","volume":"30 6","pages":"2399-2418"},"PeriodicalIF":3.5000,"publicationDate":"2025-07-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=11072061","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Tsinghua Science and Technology","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/11072061/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Multidisciplinary","Score":null,"Total":0}
Citations: 0
Abstract
Recent research has shown burgeoning interest in exploring sparse models for massively Multilingual Neural Machine Translation (MNMT). In this paper, we present a comprehensive survey of this emerging topic. Massively multilingual NMT based on sparse models offers significant improvements in parameter efficiency and reduced interference among languages compared with its dense counterparts. Various methods have been proposed that leverage sparse models to enhance translation quality; however, the lack of a thorough survey has hindered the identification and further investigation of the most promising approaches. To address this gap, we provide an exhaustive examination of the current research landscape in massively MNMT, with special emphasis on sparse models. We first categorize the sparse model-based approaches into distinct classes. We then examine each class in detail, elucidating its fundamental modeling principles, core issues, and open challenges. Wherever possible, we conduct comparative analyses to assess the strengths and weaknesses of the different methodologies. Finally, we explore potential future research avenues for sparse model-based MNMT. This survey serves as a valuable resource for both newcomers and established experts in the field of MNMT, particularly those interested in sparse model applications.
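The abstract's parameter-efficiency claim is easiest to see in the family of sparse models most commonly studied for MNMT, Mixture-of-Experts (MoE) layers, where each token activates only a few expert feed-forward networks: total parameters grow with the number of experts, while per-token compute stays roughly constant. The PyTorch sketch below is our own minimal illustration of top-k expert routing, not code from the survey; the class name, layer sizes, and routing details are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKMoELayer(nn.Module):
    """Hypothetical top-k Mixture-of-Experts feed-forward layer (illustrative only)."""

    def __init__(self, d_model: int, d_ff: int, num_experts: int, k: int = 2):
        super().__init__()
        self.k = k
        # The router scores every token against every expert.
        self.router = nn.Linear(d_model, num_experts)
        # Each expert is an ordinary Transformer-style FFN; parameters grow
        # linearly with num_experts, but only k experts run per token.
        self.experts = nn.ModuleList(
            [
                nn.Sequential(
                    nn.Linear(d_model, d_ff),
                    nn.ReLU(),
                    nn.Linear(d_ff, d_model),
                )
                for _ in range(num_experts)
            ]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model) -> flatten into a bag of tokens.
        tokens = x.reshape(-1, x.size(-1))
        gates = F.softmax(self.router(tokens), dim=-1)         # (T, E)
        weights, indices = gates.topk(self.k, dim=-1)          # (T, k)
        weights = weights / weights.sum(dim=-1, keepdim=True)  # renormalize top-k
        out = torch.zeros_like(tokens)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e  # tokens whose slot-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(1) * expert(tokens[mask])
        return out.reshape_as(x)


# Eight experts' worth of parameters, but each token pays the compute of only two.
layer = TopKMoELayer(d_model=512, d_ff=2048, num_experts=8, k=2)
y = layer(torch.randn(4, 16, 512))  # (batch=4, seq=16, d_model=512)
```

In a multilingual setting, tokens from different languages can be routed to different experts, which is one mechanism by which sparse models can reduce the cross-lingual interference that the abstract contrasts with dense models.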
Journal Introduction:
Tsinghua Science and Technology (Tsinghua Sci Technol) began publication in 1996. It is an international academic journal sponsored by Tsinghua University and published bimonthly. The journal presents up-to-date scientific achievements in computer science, electronic engineering, and other IT fields. Contributions from all over the world are welcome.