{"title":"LDRC: Long-tail Distantly Supervised Relation Extraction via Contrastive Learning","authors":"Tingwei Li, Zhi Wang","doi":"10.1145/3583788.3583804","DOIUrl":null,"url":null,"abstract":"Long-tail problem is one of the major challenges in distantly supervised relation extraction. Some recent works on the long-tail problem attempt to transfer knowledge from data-rich and semantically similar head classes to data-poor tail classes using a relation hierarchical tree. These methods, however, are based on the assumption that long-tail and head relations have a strong correlation, which does not always hold true, and the model’s ability to learn long-tail relations is essentially not improved. In this paper, a novel joint learning framework that combines relation extraction and contrastive learning is proposed, allowing the model to directly learn the subtle differences between different categories to improve long-tail relation extraction. Experimental results show that our proposed model outperforms the current state-of-the-art (SOTA) model on various mainstream datasets.","PeriodicalId":292167,"journal":{"name":"Proceedings of the 2023 7th International Conference on Machine Learning and Soft Computing","volume":"44 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2023 7th International Conference on Machine Learning and Soft Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3583788.3583804","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The long-tail problem is one of the major challenges in distantly supervised relation extraction. Some recent work on this problem attempts to transfer knowledge from data-rich, semantically similar head classes to data-poor tail classes using a hierarchical relation tree. These methods, however, rest on the assumption that long-tail relations and head relations are strongly correlated, which does not always hold, and they do not fundamentally improve the model's ability to learn long-tail relations. In this paper, we propose a novel joint learning framework that combines relation extraction with contrastive learning, allowing the model to directly learn the subtle differences between categories and thereby improve long-tail relation extraction. Experimental results show that the proposed model outperforms the current state-of-the-art (SOTA) models on several mainstream datasets.
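
The abstract does not specify the model architecture or loss formulation, so the following is only a minimal sketch of the general idea it describes: training a relation classifier jointly with a supervised contrastive objective so that same-relation representations are pulled together and different-relation representations are pushed apart. All names here (JointREModel, lambda_cl, temperature, encoder_dim) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed, not the paper's code) of jointly training
# relation classification with a supervised contrastive loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


def supervised_contrastive_loss(features: torch.Tensor,
                                labels: torch.Tensor,
                                temperature: float = 0.1) -> torch.Tensor:
    """Pull same-relation representations together, push different ones apart."""
    features = F.normalize(features, dim=1)               # (batch, dim)
    sim = features @ features.t() / temperature           # pairwise similarities
    # Exclude self-similarity on the diagonal.
    self_mask = torch.eye(len(labels), dtype=torch.bool, device=features.device)
    sim = sim.masked_fill(self_mask, -1e9)
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Positive pairs are other examples in the batch with the same relation label.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob.masked_fill(~pos_mask, 0.0)).sum(dim=1) / pos_count
    return loss.mean()


class JointREModel(nn.Module):
    """Shared sentence encoder with a relation-classification head, trained jointly."""

    def __init__(self, encoder: nn.Module, encoder_dim: int, num_relations: int):
        super().__init__()
        self.encoder = encoder
        self.classifier = nn.Linear(encoder_dim, num_relations)

    def forward(self, inputs, labels, lambda_cl: float = 0.5):
        reps = self.encoder(inputs)                        # (batch, encoder_dim)
        logits = self.classifier(reps)
        ce = F.cross_entropy(logits, labels)               # relation-extraction loss
        cl = supervised_contrastive_loss(reps, labels)     # contrastive loss
        return ce + lambda_cl * cl                         # joint objective
```

The weighting term lambda_cl is a placeholder hyperparameter; how the contrastive and classification losses are actually balanced, and how bag-level distant-supervision noise is handled, would follow the paper's full method rather than this sketch.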