AssocKD: An Association-Aware Knowledge Distillation Method for Document-Level Event Argument Extraction

IF 2.3 · CAS Tier 3 (Mathematics) · JCR Q1 (MATHEMATICS)
Mathematics · Pub Date: 2024-09-18 · DOI: 10.3390/math12182901
Lijun Tan, Yanli Hu, Jianwei Cao, Zhen Tan
Citations: 0

Abstract

Event argument extraction is a crucial subtask of event extraction, which aims at extracting arguments that correspond to argument roles when given event types. The majority of current document-level event argument extraction works focus on extracting information for only one event at a time without considering the association among events; this is known as document-level single-event extraction. However, the interrelationship among arguments can yield mutual gains in their extraction. Therefore, in this paper, we propose AssocKD, an Association-aware Knowledge Distillation Method for Document-level Event Argument Extraction, which enables the enhancement of document-level multi-event extraction with event association knowledge. Firstly, we introduce an association-aware training task to extract unknown arguments with the given privileged knowledge of relevant arguments, obtaining an association-aware model that can construct both intra-event and inter-event relationships. Secondly, we adopt multi-teacher knowledge distillation to transfer such event association knowledge from the association-aware teacher models to the event argument extraction student model. Our proposed method, AssocKD, is capable of explicitly modeling and efficiently leveraging event association to enhance the extraction of multi-event arguments at the document level. We conduct experiments on RAMS and WIKIEVENTS datasets and observe a significant improvement, thus demonstrating the effectiveness of our method.
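The multi-teacher distillation step described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration of one common way to aggregate soft targets from several teacher models into a single distillation target for a student model; it is not the authors' implementation, and all function names and the averaging scheme are assumptions for illustration only.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits to a probability distribution at a given temperature."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two discrete distributions over the same labels."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def multi_teacher_distillation_loss(student_logits, teacher_logits_list,
                                    temperature=2.0):
    """Distillation loss against the mean of the teachers' softened outputs.

    Each teacher contributes a temperature-softened distribution; the student
    is trained to match their average. Averaging is only one simple way to
    combine multiple teachers in knowledge distillation.
    """
    n = len(teacher_logits_list)
    teacher_probs = [softmax(t, temperature) for t in teacher_logits_list]
    avg_teacher = [sum(p[i] for p in teacher_probs) / n
                   for i in range(len(student_logits))]
    student_probs = softmax(student_logits, temperature)
    # The temperature^2 factor keeps gradient magnitudes comparable
    # across different temperature settings (standard KD practice).
    return (temperature ** 2) * kl_divergence(avg_teacher, student_probs)

# Example: two association-aware teachers and one student scoring
# three candidate argument roles for a single span.
loss = multi_teacher_distillation_loss(
    student_logits=[2.0, 0.5, -1.0],
    teacher_logits_list=[[2.5, 0.2, -1.2], [2.2, 0.6, -0.8]],
)
```

In practice the distillation term would be combined with the student's supervised extraction loss; the relative weighting of the two is a tunable hyperparameter.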
Source Journal
Mathematics (General Mathematics)
CiteScore: 4.00
Self-citation rate: 16.70%
Publications: 4032
Review time: 21.9 days
About the journal: Mathematics (ISSN 2227-7390) is an international, open-access journal that provides an advanced forum for studies related to the mathematical sciences. It is devoted exclusively to the publication of high-quality reviews, regular research papers, and short communications in all areas of pure and applied mathematics. Mathematics also publishes timely and thorough survey articles on current trends, new theoretical techniques, novel ideas, and new mathematical tools in different branches of mathematics.