Interactive attention and contrastive learning for few-shot relation extraction

IF 6.5 · CAS Zone 2 (Computer Science) · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Yan Li, Yao Wang, Zhaojie Wang, Wei Wang, Bailing Wang, Guodong Xin
{"title":"基于交互注意和对比学习的小镜头关系提取","authors":"Yan Li ,&nbsp;Yao Wang ,&nbsp;Zhaojie Wang ,&nbsp;Wei Wang ,&nbsp;Bailing Wang ,&nbsp;Guodong Xin","doi":"10.1016/j.neucom.2025.131551","DOIUrl":null,"url":null,"abstract":"<div><div>Relation extraction is a critical task in natural language processing, often challenged by the problem of insufficient samples in real world scenarios. Therefore, studying few-shot relation extraction is of great significance. Currently, prototype networks and meta-learning-based parameter optimization are the mainstream methods to study this kind of problem. However, these methods still face sample confusion during classification, and the trained models are prone to overfitting. To solve these problems, this paper proposes a few-shot relation extraction method based on interactive attention. During the model training stage, we introduce two contrastive learning approaches to better capture sample features and reduce sample confusion. Contrastive learning strengthens the connections between instances and their corresponding relationship descriptions, thus improving relation extraction. In the testing phase, the model employs an attention mechanism to calculate the attention scores between the query set and the support set and employs a new classification layer to mitigate overfitting. We conducted experiments on two real-world few-shot relation extraction datasets, and the results demonstrate that our method achieved superior performance on both in-domain and cross-domain datasets, proving the effectiveness of the proposed approach. The code is available at <span><span>https://github.com/xyzew/IACL.git</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"658 ","pages":"Article 131551"},"PeriodicalIF":6.5000,"publicationDate":"2025-09-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Interactive attention and contrastive learning for few-shot relation extraction\",\"authors\":\"Yan Li ,&nbsp;Yao Wang ,&nbsp;Zhaojie Wang ,&nbsp;Wei Wang ,&nbsp;Bailing Wang ,&nbsp;Guodong Xin\",\"doi\":\"10.1016/j.neucom.2025.131551\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Relation extraction is a critical task in natural language processing, often challenged by the problem of insufficient samples in real world scenarios. Therefore, studying few-shot relation extraction is of great significance. Currently, prototype networks and meta-learning-based parameter optimization are the mainstream methods to study this kind of problem. However, these methods still face sample confusion during classification, and the trained models are prone to overfitting. To solve these problems, this paper proposes a few-shot relation extraction method based on interactive attention. During the model training stage, we introduce two contrastive learning approaches to better capture sample features and reduce sample confusion. Contrastive learning strengthens the connections between instances and their corresponding relationship descriptions, thus improving relation extraction. In the testing phase, the model employs an attention mechanism to calculate the attention scores between the query set and the support set and employs a new classification layer to mitigate overfitting. 
We conducted experiments on two real-world few-shot relation extraction datasets, and the results demonstrate that our method achieved superior performance on both in-domain and cross-domain datasets, proving the effectiveness of the proposed approach. The code is available at <span><span>https://github.com/xyzew/IACL.git</span><svg><path></path></svg></span>.</div></div>\",\"PeriodicalId\":19268,\"journal\":{\"name\":\"Neurocomputing\",\"volume\":\"658 \",\"pages\":\"Article 131551\"},\"PeriodicalIF\":6.5000,\"publicationDate\":\"2025-09-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neurocomputing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0925231225022234\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231225022234","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Relation extraction is a critical task in natural language processing, often challenged by insufficient samples in real-world scenarios, which makes few-shot relation extraction an important problem to study. Prototype networks and meta-learning-based parameter optimization are currently the mainstream approaches, but they still suffer from sample confusion during classification, and the trained models are prone to overfitting. To address these problems, this paper proposes a few-shot relation extraction method based on interactive attention. During training, two contrastive learning approaches are introduced to better capture sample features and reduce sample confusion; contrastive learning strengthens the connections between instances and their corresponding relation descriptions, thereby improving relation extraction. At test time, the model uses an attention mechanism to compute attention scores between the query set and the support set, together with a new classification layer that mitigates overfitting. Experiments on two real-world few-shot relation extraction datasets show that the method achieves superior performance on both in-domain and cross-domain benchmarks, confirming its effectiveness. The code is available at https://github.com/xyzew/IACL.git.
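The linked repository contains the authors' full implementation. Purely as an illustration of the training-stage idea, the following is a minimal PyTorch sketch of an InfoNCE-style contrastive loss between instance embeddings and relation-description embeddings; the function name, tensor shapes, and the diagonal-positive pairing are assumptions made for this sketch, not the paper's actual code.

```python
import torch
import torch.nn.functional as F

def instance_description_contrastive_loss(instance_emb, description_emb, temperature=0.1):
    """Illustrative InfoNCE loss: pull each instance embedding toward the
    embedding of its own relation description and push it away from the
    other relations' descriptions in the batch.

    instance_emb:    (N, d) encoded instances
    description_emb: (N, d) encoded relation descriptions, row-aligned so
                     that description_emb[i] describes instance_emb[i]
                     (an assumption of this sketch)
    """
    # Cosine similarity between every instance and every description.
    inst = F.normalize(instance_emb, dim=-1)
    desc = F.normalize(description_emb, dim=-1)
    logits = inst @ desc.t() / temperature  # (N, N)

    # With row-aligned pairs, each positive sits on the diagonal.
    targets = torch.arange(instance_emb.size(0), device=instance_emb.device)
    return F.cross_entropy(logits, targets)
```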
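Likewise, the test-phase query-support attention described in the abstract might be realized as in the sketch below: each query re-weights every relation's support shots to form query-conditioned prototypes and is then scored against them. The shapes, the scaled dot-product scoring, and the prototype construction are assumptions for illustration; the repository holds the actual classification layer.

```python
import torch

def query_support_logits(query_emb, support_emb):
    """Illustrative query-support attention for an N-way K-shot episode.

    query_emb:   (Q, d)     encoded query instances
    support_emb: (N, K, d)  N relations with K support shots each
    """
    q, d = query_emb.shape
    n, k, _ = support_emb.shape

    # Scaled dot-product scores between each query and every support shot.
    scores = query_emb @ support_emb.reshape(n * k, d).t() / d ** 0.5  # (Q, N*K)
    weights = scores.view(q, n, k).softmax(dim=-1)  # normalize over the K shots

    # Query-conditioned prototypes: attention-weighted mean of each
    # relation's support shots, one prototype set per query.
    prototypes = torch.einsum('qnk,nkd->qnd', weights, support_emb)  # (Q, N, d)

    # Score each query against its own conditioned prototypes.
    return torch.einsum('qd,qnd->qn', query_emb, prototypes)  # (Q, N)
```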
Source journal
Neurocomputing
Category: Engineering & Technology – Computer Science: Artificial Intelligence
CiteScore: 13.10
Self-citation rate: 10.00%
Annual articles: 1382
Review time: 70 days
Aims and scope: Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice and applications are the essential topics being covered.