{"title":"Generalized Zero-Shot Text Classification via Inter-Class Relationship","authors":"Yiwen Zhang, Caixia Yuan, Xiaojie Wang","doi":"10.1109/CCIS53392.2021.9754674","DOIUrl":null,"url":null,"abstract":"Generalized zero-shot text classification (GZSTC) aims to classify textual instances from both previously seen classes and novel classes which are totally unseen during training. However, previous supervised metric learning methods cause severe domain bias problem. To tackle this problem, we propose a GZSTC method to reduce the gap from the fully trained seen domain and unaware unseen domain using relationship. Concretely, the proposed model gains beneficial experiences through multiple mimic GZSTC tasks during training. In every mimic GZSTC task, the model explicitly takes advantage of the relationship between the mimetic seen classes and unseen classes, which generalizes well on the real testing unseen classes. We extensively evaluate the performance on two GZSTC datasets. The results show that our method can alleviate the domain bias problem and outperform the state-of-the-arts by a large margin.","PeriodicalId":191226,"journal":{"name":"2021 IEEE 7th International Conference on Cloud Computing and Intelligent Systems (CCIS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE 7th International Conference on Cloud Computing and Intelligent Systems (CCIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCIS53392.2021.9754674","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Generalized zero-shot text classification (GZSTC) aims to classify textual instances from both previously seen classes and novel classes that are entirely unseen during training. However, previous supervised metric learning methods suffer from a severe domain bias problem. To tackle this problem, we propose a GZSTC method that uses inter-class relationships to reduce the gap between the fully trained seen domain and the unseen domain. Concretely, the proposed model gains transferable experience through multiple mimic GZSTC tasks during training. In each mimic GZSTC task, the model explicitly exploits the relationship between the mimetic seen classes and mimetic unseen classes, and thus generalizes well to the real unseen classes at test time. We extensively evaluate the performance on two GZSTC datasets. The results show that our method alleviates the domain bias problem and outperforms the state of the art by a large margin.
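To make the "mimic GZSTC task" idea more concrete, below is a minimal sketch of how such episodes could be constructed: in each training episode a few seen classes are held out to act as pseudo-unseen classes, and an inter-class relationship (here, cosine similarity between class label embeddings) links them back to the remaining pseudo-seen classes. This is an illustrative approximation based only on the abstract, not the authors' actual algorithm; all names (make_mimic_episode, class_relationship, the toy class list) are hypothetical.

```python
# Illustrative sketch only (assumption): episodic "mimic" GZSTC task construction,
# not the paper's exact training procedure.
import random
import numpy as np


def make_mimic_episode(seen_classes, n_pseudo_unseen=2, seed=None):
    """Randomly hold out some seen classes to play the role of unseen classes."""
    rng = random.Random(seed)
    pseudo_unseen = set(rng.sample(list(seen_classes), n_pseudo_unseen))
    pseudo_seen = [c for c in seen_classes if c not in pseudo_unseen]
    return pseudo_seen, sorted(pseudo_unseen)


def class_relationship(class_embeddings, pseudo_seen, pseudo_unseen):
    """Cosine similarity between pseudo-unseen and pseudo-seen class embeddings,
    a simple stand-in for the inter-class relationship exploited during training."""
    def unit(v):
        return v / (np.linalg.norm(v) + 1e-8)

    S = np.stack([unit(class_embeddings[c]) for c in pseudo_seen])    # (|S|, d)
    U = np.stack([unit(class_embeddings[c]) for c in pseudo_unseen])  # (|U|, d)
    return U @ S.T                                                    # (|U|, |S|)


# Toy usage with random label embeddings standing in for real class descriptions.
classes = ["weather", "music", "alarm", "news", "sports"]
emb = {c: np.random.randn(50) for c in classes}
ps, pu = make_mimic_episode(classes, n_pseudo_unseen=2, seed=0)
print("pseudo-seen:", ps, "pseudo-unseen:", pu)
print("relationship matrix shape:", class_relationship(emb, ps, pu).shape)
```

In this sketch, repeating the episode construction with different random splits exposes the model to many seen/unseen partitions during training, which is the general mechanism the abstract credits for reducing domain bias at test time.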