Toward Consistent and Informative Event-Event Temporal Relation Extraction

Xiaomeng Jin, Haoyang Wen, Xinya Du, Heng Ji
DOI: 10.18653/v1/2023.matching-1.3
Published in: Proceedings of the First Workshop on Matching From Unstructured and Structured Data (MATCHING 2023)
Citations: 1

Abstract

Event-event temporal relation extraction aims to extract the temporal order between a pair of event mentions, which is usually used to construct temporal event graphs. However, event graphs generated by existing methods are usually globally inconsistent (event graphs containing cycles), semantically irrelevant (two unrelated events having temporal links), and context unaware (neglecting neighborhood information of an event node). In this paper, we propose a novel event-event temporal relation extraction method to address these limitations. Our model combines a pretrained language model and a graph neural network to output event embeddings, which capture the contextual information of event graphs. Moreover, to achieve global consistency and semantic relevance, (1) event temporal order should be in accordance with the norm of their embeddings, and (2) two events have a temporal relation only if their embeddings are close enough. Experimental results on a real-world event dataset demonstrate that our method achieves state-of-the-art performance and generates high-quality event graphs.
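The two constraints in the abstract can be sketched as a simple decision rule over a pair of event embeddings. This is a minimal illustration, not the paper's actual model: the function name, the distance threshold, and the assumption that the earlier event has the smaller embedding norm are all illustrative choices (the abstract only states that temporal order accords with the norm and that related events must be close).

```python
import numpy as np

def predict_temporal_relation(e1, e2, distance_threshold=1.0):
    """Illustrative decision rule for the two embedding constraints:
    (1) temporal order follows the embedding norms, and
    (2) a temporal relation exists only if the embeddings are close.
    The threshold and the norm-to-order direction are assumptions."""
    e1 = np.asarray(e1, dtype=float)
    e2 = np.asarray(e2, dtype=float)
    # Constraint (2): semantically unrelated events lie far apart
    # in embedding space, so no temporal link is predicted.
    if np.linalg.norm(e1 - e2) > distance_threshold:
        return "no relation"
    # Constraint (1): order the pair by embedding norm (assuming,
    # for illustration, that the earlier event has the smaller norm).
    return "before" if np.linalg.norm(e1) < np.linalg.norm(e2) else "after"
```

Because every prediction is derived from a single scalar per event (its norm), the induced order is transitive by construction, which is one way such a scheme can rule out cycles in the resulting event graph.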