Multi-task Learning for Relation Extraction
Kai Zhou, Xiangfeng Luo, Hongya Wang, R. Xu
2019 IEEE 31st International Conference on Tools with Artificial Intelligence (ICTAI), 2019-11-01
DOI: 10.1109/ICTAI.2019.00210 (https://doi.org/10.1109/ICTAI.2019.00210)
Citations: 1
Abstract
Distantly supervised relation extraction leverages knowledge bases to label training data automatically. However, distant supervision may introduce incorrect labels, which harms performance. Many efforts have been devoted to tackling this problem, but most treat relation extraction as a simple classification task. As a result, they ignore useful information from related tasks, namely dependency parsing and entity type classification. In this paper, we first propose a novel Multi-Task learning framework for Relation Extraction (MTRE). We employ dependency parsing and entity type classification as auxiliary tasks and relation extraction as the target task. We learn these tasks simultaneously from training instances to take advantage of inductive transfer between the auxiliary tasks and the target task. We then construct a hierarchical neural network that incorporates dependency and entity representations from the auxiliary tasks into a relation representation that is more robust to noisy labels. The experimental results demonstrate that our model substantially improves predictive performance over single-task learning baselines.
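The multi-task setup described above — one shared representation feeding a target head (relation extraction) and two auxiliary heads (dependency parsing, entity typing), trained jointly — can be sketched in miniature. This is a minimal numpy illustration of the general pattern, not the paper's MTRE architecture: all dimensions, the single-layer encoder, the per-token simplification of dependency parsing, and the auxiliary loss weight are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes, not taken from the paper.
D_IN, D_HID = 16, 8
N_REL, N_DEP, N_ENT = 5, 10, 4  # label-set size per task

# Shared encoder (one linear layer + tanh, for illustration only).
W_shared = rng.normal(scale=0.1, size=(D_IN, D_HID))

# One linear classification head per task: relation extraction is the
# target task; dependency parsing and entity typing are auxiliary tasks.
heads = {
    "relation": rng.normal(scale=0.1, size=(D_HID, N_REL)),
    "dependency": rng.normal(scale=0.1, size=(D_HID, N_DEP)),
    "entity": rng.normal(scale=0.1, size=(D_HID, N_ENT)),
}

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, label):
    # Negative log-likelihood of the gold label.
    return -np.log(probs[label] + 1e-12)

def multi_task_loss(x, labels, aux_weight=0.5):
    """Joint loss for one instance: the shared representation h feeds
    every head, so gradients from the auxiliary losses also shape h
    (the inductive-transfer effect the abstract refers to)."""
    h = np.tanh(x @ W_shared)  # shared representation
    losses = {task: cross_entropy(softmax(h @ W), labels[task])
              for task, W in heads.items()}
    # Target loss plus down-weighted auxiliary losses.
    return (losses["relation"]
            + aux_weight * (losses["dependency"] + losses["entity"]))

x = rng.normal(size=D_IN)
labels = {"relation": 2, "dependency": 7, "entity": 1}
loss = multi_task_loss(x, labels)
```

In actual joint training, all three cross-entropy terms would be minimized together over the training instances, so the shared encoder is pushed toward features useful for every task rather than overfitting the (possibly noisy) relation labels alone.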