{"title":"学习判别和无偏表示的少镜头关系提取","authors":"Jiale Han, Bo Cheng, Guoshun Nan","doi":"10.1145/3459637.3482268","DOIUrl":null,"url":null,"abstract":"Few-shot relation extraction (FSRE) aims to predict the relation for a pair of entities in a sentence by exploring a few labeled instances for each relation type. Current methods mainly rely on meta-learning to learn generalized representations by optimizing the network parameters based on various collections of tasks sampled from training data. However, these methods may suffer from two main issues. 1) Insufficient supervision of meta-learning to learn discriminative representations on very few training instances, which are sampled from a large amount of base class data. 2) Spurious correlations between entities and relation types due to the biased training procedure that focuses more on entity pair rather than context. To learn more discriminative and unbiased representations for FSRE, this paper proposes a two-stage approach via supervised contrastive learning and sentence- and entity-level prototypical networks. In the first (pre-training) stage, we introduce a supervised contrastive pre-training method, which is able to yield more discriminative representations by learning from the entire training instances, such that the semantically related representations are close to each other, and far away otherwise. In the second (meta-learning) stage, we propose a novel sentence- and entity-level prototypical network equipped with fine-grained feature-wise fusion strategy to learn unbiased representations, where the networks are initialized with the parameters trained in the first stage. Specifically, the proposed network consists of a sentence branch and an entity branch, taking entire sentences and entity mentions as inputs, respectively. The entity branch explicitly captures the correlation between entity pairs and relations, and then dynamically adjusts the sentence branch's prediction distributions. By doing so, the spurious correlations issue caused by biased training samples can be properly mitigated. Extensive experiments on two FSRE benchmarks demonstrate the effectiveness of our approach.","PeriodicalId":405296,"journal":{"name":"Proceedings of the 30th ACM International Conference on Information & Knowledge Management","volume":"34 3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-10-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Learning Discriminative and Unbiased Representations for Few-Shot Relation Extraction\",\"authors\":\"Jiale Han, Bo Cheng, Guoshun Nan\",\"doi\":\"10.1145/3459637.3482268\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Few-shot relation extraction (FSRE) aims to predict the relation for a pair of entities in a sentence by exploring a few labeled instances for each relation type. Current methods mainly rely on meta-learning to learn generalized representations by optimizing the network parameters based on various collections of tasks sampled from training data. However, these methods may suffer from two main issues. 1) Insufficient supervision of meta-learning to learn discriminative representations on very few training instances, which are sampled from a large amount of base class data. 2) Spurious correlations between entities and relation types due to the biased training procedure that focuses more on entity pair rather than context. 
To learn more discriminative and unbiased representations for FSRE, this paper proposes a two-stage approach via supervised contrastive learning and sentence- and entity-level prototypical networks. In the first (pre-training) stage, we introduce a supervised contrastive pre-training method, which is able to yield more discriminative representations by learning from the entire training instances, such that the semantically related representations are close to each other, and far away otherwise. In the second (meta-learning) stage, we propose a novel sentence- and entity-level prototypical network equipped with fine-grained feature-wise fusion strategy to learn unbiased representations, where the networks are initialized with the parameters trained in the first stage. Specifically, the proposed network consists of a sentence branch and an entity branch, taking entire sentences and entity mentions as inputs, respectively. The entity branch explicitly captures the correlation between entity pairs and relations, and then dynamically adjusts the sentence branch's prediction distributions. By doing so, the spurious correlations issue caused by biased training samples can be properly mitigated. Extensive experiments on two FSRE benchmarks demonstrate the effectiveness of our approach.\",\"PeriodicalId\":405296,\"journal\":{\"name\":\"Proceedings of the 30th ACM International Conference on Information & Knowledge Management\",\"volume\":\"34 3 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-10-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 30th ACM International Conference on Information & Knowledge Management\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3459637.3482268\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 30th ACM International Conference on Information & Knowledge Management","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3459637.3482268","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Learning Discriminative and Unbiased Representations for Few-Shot Relation Extraction
Few-shot relation extraction (FSRE) aims to predict the relation between a pair of entities in a sentence from only a few labeled instances per relation type. Current methods mainly rely on meta-learning to learn generalized representations by optimizing network parameters over collections of tasks sampled from the training data. However, these methods may suffer from two main issues. 1) The supervision provided by meta-learning is insufficient to learn discriminative representations, because each task contains only a handful of training instances sampled from a large amount of base-class data. 2) Spurious correlations arise between entities and relation types, because the biased training procedure focuses more on entity pairs than on context. To learn more discriminative and unbiased representations for FSRE, this paper proposes a two-stage approach based on supervised contrastive learning and sentence- and entity-level prototypical networks. In the first (pre-training) stage, we introduce a supervised contrastive pre-training method that yields more discriminative representations by learning from all training instances, pulling semantically related representations close together and pushing unrelated ones apart. In the second (meta-learning) stage, we propose a novel sentence- and entity-level prototypical network equipped with a fine-grained feature-wise fusion strategy to learn unbiased representations, initializing the network with the parameters trained in the first stage. Specifically, the proposed network consists of a sentence branch and an entity branch, which take entire sentences and entity mentions as inputs, respectively. The entity branch explicitly captures the correlation between entity pairs and relations, and then dynamically adjusts the sentence branch's prediction distributions. In this way, the spurious-correlation issue caused by biased training samples is mitigated. Extensive experiments on two FSRE benchmarks demonstrate the effectiveness of our approach.
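To make the pre-training stage concrete, the following is a minimal PyTorch sketch of a supervised contrastive loss of the kind the first stage relies on. The function name, batch construction, and temperature value are illustrative assumptions rather than the paper's exact implementation; instances sharing a relation label are treated as positives and pulled together, while instances of other relations are pushed apart.

```python
# A minimal sketch of a supervised contrastive loss (SupCon-style), assuming
# relation representations and labels for a mini-batch; not the paper's exact code.
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(embeddings: torch.Tensor,
                                labels: torch.Tensor,
                                temperature: float = 0.1) -> torch.Tensor:
    """embeddings: (N, d) relation representations; labels: (N,) relation ids."""
    z = F.normalize(embeddings, dim=1)            # unit-norm features
    sim = z @ z.t() / temperature                 # pairwise similarities
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(self_mask, float('-inf'))    # exclude self-similarity
    # Positives: other in-batch instances with the same relation label.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Average log-likelihood over each anchor's positives.
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob * pos_mask).sum(dim=1) / pos_counts
    return loss[pos_mask.any(dim=1)].mean()
```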
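The second-stage two-branch design can likewise be sketched at a high level. The snippet below assumes pre-computed sentence-level and entity-level embeddings and uses a simple sigmoid gate for the feature-wise fusion, so that information from the entity branch modulates the sentence features before prototypical classification. The class name and fusion details are hypothetical simplifications, not the paper's exact architecture.

```python
# A minimal sketch of a sentence- and entity-branch prototypical network with a
# gated, feature-wise fusion; the actual fusion strategy in the paper may differ.
import torch
import torch.nn as nn

class TwoBranchProtoNet(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        # Feature-wise gate produced from the entity-branch representation.
        self.gate = nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.Sigmoid())

    @staticmethod
    def prototypes(support: torch.Tensor) -> torch.Tensor:
        """support: (N_way, K_shot, d) -> class prototypes (N_way, d)."""
        return support.mean(dim=1)

    def forward(self, sent_support, sent_query, ent_support, ent_query):
        """sent_*/ent_*: sentence- and entity-branch embeddings.
        support tensors: (N, K, d); query tensors: (Q, d). Returns logits (Q, N)."""
        # Entity-level gates modulate sentence features element-wise.
        g_q = self.gate(ent_query)                    # (Q, d)
        g_s = self.gate(ent_support)                  # (N, K, d)
        fused_query = sent_query * g_q + ent_query
        fused_support = sent_support * g_s + ent_support
        protos = self.prototypes(fused_support)       # (N, d)
        # Standard prototypical-network logits: negative squared Euclidean distance.
        return -torch.cdist(fused_query, protos).pow(2)
```

In this sketch the entity branch influences the final prediction both through the gate and through the residual addition, which loosely mirrors how the paper's entity branch adjusts the sentence branch's prediction distribution.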