Den-ML: Multi-source cross-lingual transfer via denoising mutual learning

IF 7.4 · CAS Region 1 (Management) · JCR Q1 · Computer Science, Information Systems
Ling Ge, Chunming Hu, Guanghui Ma, Hong Zhang, Jihong Liu
{"title":"Den-ML: Multi-source cross-lingual transfer via denoising mutual learning","authors":"Ling Ge ,&nbsp;Chunming Hu ,&nbsp;Guanghui Ma ,&nbsp;Hong Zhang ,&nbsp;Jihong Liu","doi":"10.1016/j.ipm.2024.103834","DOIUrl":null,"url":null,"abstract":"<div><p>Multi-source cross-lingual transfer aims to acquire task knowledge from multiple labelled source languages and transfer it to an unlabelled target language, enabling effective performance in this target language. The existing methods mainly focus on weighting predictions of language-specific classifiers trained in source languages to derive final results for target samples. However, we argue that, due to the language gap, language-specific classifiers inevitably generate many noisy predictions for target samples. Furthermore, these methods disregard the mutual guidance and utilization of knowledge among multiple source languages. To address these issues, we propose a novel model, Den-ML, which improves the model’s performance in multi-source scenarios through two perspectives: reducing prediction noise of language-specific classifiers and prompting mutual learning among these classifiers. Firstly, Den-ML devises a discrepancy-guided denoising learning method to learn discriminative representations for the target language, thus mitigating the noise prediction of classifiers. Secondly, Den-ML develops a pseudo-label-supervised mutual learning method, which relies on forcing probability distribution interactions among multiple language-specific classifiers for knowledge transfer, thus achieving mutual learning among classifiers. We conduct experiments on three different tasks, named entity recognition, paraphrase identification and natural language inference, with two different multi-source combination settings (same- and different-family settings) covering 39 languages. Our approach outperforms the benchmark and the SOTA model in most metrics for all three tasks in different settings. 
In addition, we perform ablation, visualization and analysis experiments on three different tasks, and the experimental results validate the effectiveness of our approach.</p></div>","PeriodicalId":50365,"journal":{"name":"Information Processing & Management","volume":null,"pages":null},"PeriodicalIF":7.4000,"publicationDate":"2024-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Processing & Management","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0306457324001936","RegionNum":1,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
引用次数: 0

Abstract

Multi-source cross-lingual transfer aims to acquire task knowledge from multiple labelled source languages and transfer it to an unlabelled target language, enabling effective performance in that target language. Existing methods mainly focus on weighting the predictions of language-specific classifiers trained on the source languages to derive final results for target samples. However, we argue that, due to the language gap, language-specific classifiers inevitably generate many noisy predictions for target samples. Furthermore, these methods disregard the mutual guidance and utilization of knowledge among the multiple source languages. To address these issues, we propose a novel model, Den-ML, which improves performance in multi-source scenarios from two perspectives: reducing the prediction noise of language-specific classifiers and promoting mutual learning among these classifiers. First, Den-ML devises a discrepancy-guided denoising learning method to learn discriminative representations for the target language, thus mitigating the classifiers' noisy predictions. Second, Den-ML develops a pseudo-label-supervised mutual learning method, which enforces probability-distribution interactions among the multiple language-specific classifiers for knowledge transfer, thus achieving mutual learning among the classifiers. We conduct experiments on three different tasks — named entity recognition, paraphrase identification and natural language inference — with two multi-source combination settings (same- and different-family) covering 39 languages. Our approach outperforms the baseline and state-of-the-art models on most metrics for all three tasks in both settings. In addition, we perform ablation, visualization and analysis experiments on the three tasks, and the results validate the effectiveness of our approach.
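The abstract does not spell out the exact form of the pseudo-label-supervised mutual learning objective. As a rough, hypothetical sketch of the general idea — a cross-entropy term supervising each language-specific classifier with pseudo-labels on target samples, plus a pairwise KL term that forces the classifiers' probability distributions toward one another — it might look as follows (all function and variable names are ours, not from the paper):

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    # Mean KL(p || q) over a batch of categorical distributions.
    return np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1))

def mutual_learning_loss(logits_per_source, pseudo_labels, alpha=0.5):
    """Illustrative combined objective for mutual learning among
    language-specific classifiers on unlabelled target samples.

    logits_per_source: list of (N, C) logit arrays, one per source-language
                       classifier, computed on the same N target samples.
    pseudo_labels:     (N,) integer pseudo-labels for those samples.
    alpha:             weight of the mutual (pairwise KL) term.
    """
    probs = [softmax(l) for l in logits_per_source]
    n_cls = len(probs)
    idx = np.arange(len(pseudo_labels))

    # Pseudo-label supervision: average cross-entropy across classifiers.
    ce = np.mean([-np.mean(np.log(p[idx, pseudo_labels] + 1e-12))
                  for p in probs])

    # Mutual learning: each classifier mimics every other classifier's
    # predictive distribution (symmetric pairwise KL, averaged).
    kl, pairs = 0.0, 0
    for i in range(n_cls):
        for j in range(n_cls):
            if i != j:
                kl += kl_div(probs[j], probs[i])
                pairs += 1
    kl = kl / pairs if pairs else 0.0

    return ce + alpha * kl
```

When all classifiers agree exactly, the KL term vanishes and the loss reduces to the pseudo-label cross-entropy alone; disagreement among classifiers increases the loss, pushing their distributions together. This mirrors the spirit of deep mutual learning, but the precise weighting and KL direction used by Den-ML may differ.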

Source journal: Information Processing & Management (Engineering/Technology — Computer Science: Information Systems)

CiteScore: 17.00 · Self-citation rate: 11.60% · Articles per year: 276 · Review time: 39 days

Journal description: Information Processing and Management is dedicated to publishing cutting-edge original research at the convergence of computing and information science. Its scope encompasses theory, methods, and applications across various domains, including advertising, business, health, information science, information technology marketing, and social computing. The journal caters to both primary researchers and practitioners by offering an effective platform for the timely dissemination of advanced and topical issues in this interdisciplinary field, with particular emphasis on original research articles, research survey articles, research method articles, and articles addressing critical applications of research.