Reducing Spurious Correlations for Relation Extraction by Feature Decomposition and Semantic Augmentation

Tianshu Yu, Min Yang, Chengming Li, Ruifeng Xu
{"title":"Reducing Spurious Correlations for Relation Extraction by Feature Decomposition and Semantic Augmentation","authors":"Tianshu Yu, Min Yang, Chengming Li, Ruifeng Xu","doi":"10.1145/3539618.3592050","DOIUrl":null,"url":null,"abstract":"Deep neural models have become mainstream in relation extraction (RE), yielding state-of-the-art performance. However, most existing neural models are prone to spurious correlations between input features and prediction labels, making the models suffer from low robustness and generalization.In this paper, we propose a spurious correlation reduction method for RE via feature decomposition and semantic augmentation (denoted as FDSA). First, we decompose the original sentence representation into class-related features and context-related features. To obtain better context-related features, we devise a contrastive learning method to pull together the context-related features of the anchor sentence and its augmented sentences, and push away the context-related features of different anchor sentences. In addition, we propose gradient-based semantic augmentation on context-related features in order to improve the robustness of the RE model. Experiments on four datasets show that our model outperforms the strong competitors.","PeriodicalId":425056,"journal":{"name":"Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval","volume":"13 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3539618.3592050","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Deep neural models have become mainstream in relation extraction (RE), yielding state-of-the-art performance. However, most existing neural models are prone to spurious correlations between input features and prediction labels, which weakens their robustness and generalization. In this paper, we propose a spurious correlation reduction method for RE via feature decomposition and semantic augmentation (denoted as FDSA). First, we decompose the original sentence representation into class-related features and context-related features. To obtain better context-related features, we devise a contrastive learning method that pulls together the context-related features of an anchor sentence and its augmented sentences, and pushes apart the context-related features of different anchor sentences. In addition, we propose gradient-based semantic augmentation on the context-related features to improve the robustness of the RE model. Experiments on four datasets show that our model outperforms strong competitors.
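
The contrastive objective and gradient-based augmentation described above can be pictured with a short sketch. The snippet below is not the authors' released code; it is a minimal PyTorch illustration under assumed names (FeatureDecomposer, context_contrastive_loss, and gradient_semantic_augmentation are all hypothetical). It shows (1) two projection heads that split a sentence vector into class-related and context-related parts, (2) an InfoNCE-style loss that pulls an anchor's context features toward those of its augmented sentence and pushes them away from other anchors in the batch, and (3) a gradient-step perturbation of the context features as a stand-in for the paper's gradient-based semantic augmentation.

```python
# Minimal sketch, not the authors' implementation; all names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureDecomposer(nn.Module):
    """Two projection heads that split a sentence representation into a
    class-related part and a context-related part."""

    def __init__(self, hidden_dim: int, feat_dim: int = 128):
        super().__init__()
        self.class_head = nn.Linear(hidden_dim, feat_dim)
        self.context_head = nn.Linear(hidden_dim, feat_dim)

    def forward(self, sent_repr: torch.Tensor):
        return self.class_head(sent_repr), self.context_head(sent_repr)


def context_contrastive_loss(anchor_ctx: torch.Tensor,
                             augmented_ctx: torch.Tensor,
                             temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss: each anchor's context feature is pulled toward the
    context feature of its own augmented sentence (the diagonal) and pushed
    away from the context features of the other anchors in the batch."""
    anchor = F.normalize(anchor_ctx, dim=-1)        # (B, D)
    augmented = F.normalize(augmented_ctx, dim=-1)  # (B, D)
    logits = anchor @ augmented.t() / temperature   # (B, B) similarity matrix
    targets = torch.arange(anchor.size(0), device=anchor.device)
    return F.cross_entropy(logits, targets)


def gradient_semantic_augmentation(ctx_feat: torch.Tensor,
                                   loss: torch.Tensor,
                                   epsilon: float = 0.05) -> torch.Tensor:
    """Perturb context-related features along the loss gradient (an
    FGSM-style step) to create harder augmented features for training."""
    grad, = torch.autograd.grad(loss, ctx_feat, retain_graph=True)
    return ctx_feat + epsilon * grad.sign()


if __name__ == "__main__":
    # Toy usage: random tensors stand in for encoder (e.g., BERT) outputs.
    decomposer = FeatureDecomposer(hidden_dim=768)
    h_anchor = torch.randn(8, 768)   # anchor sentences
    h_augment = torch.randn(8, 768)  # their augmented counterparts
    _, ctx_anchor = decomposer(h_anchor)
    _, ctx_augment = decomposer(h_augment)

    loss = context_contrastive_loss(ctx_anchor, ctx_augment)
    harder_ctx = gradient_semantic_augmentation(ctx_anchor, loss)
    print(loss.item(), harder_ctx.shape)
```

In this toy usage, random tensors replace the encoder outputs; in a full pipeline the class-related features would feed the relation classifier, while the contrastive and augmentation terms act only on the context-related branch.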