A Knowledge Enhanced Chinese GaoKao Reading Comprehension Method
Xiao Zhang, Heqi Zheng, Heyan Huang, Zewen Chi, Xian-Ling Mao
2021 IEEE International Conference on Big Knowledge (ICBK), December 2021. DOI: 10.1109/ICKG52313.2021.00053
Chinese GaoKao reading comprehension is a challenging NLP task: it requires strong logical reasoning ability to capture the deep semantic relations between questions and answers. However, most traditional models cannot learn sufficient inference ability because Chinese GaoKao reading comprehension data is scarce. Intuitively, there are two ways to improve performance on this task: (1) increase the scale of the data, and (2) introduce additional related knowledge. In this paper, we propose a novel method based on adversarial training and knowledge distillation, which can be trained on other knowledge-rich datasets and transferred to the Chinese GaoKao reading comprehension task. Extensive experiments show that the proposed model outperforms state-of-the-art baselines. The code and the relevant dataset will be made publicly available.
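The abstract only names the ingredients of the method (knowledge distillation plus adversarial training for cross-dataset transfer). The sketch below is not the authors' released code; it is a minimal illustration, under common assumptions, of how those two ingredients are typically combined: a teacher trained on a knowledge-rich source dataset is distilled into a student via a temperature-scaled KL term, while FGM-style adversarial perturbation of the embedding matrix is applied during student training. The `TinyReader` model, `train_step`, and all hyperparameters are hypothetical placeholders.

```python
# Minimal sketch: knowledge distillation + FGM-style adversarial training.
# Assumption: a teacher model (trained on a knowledge-rich dataset) guides a
# student fine-tuned on GaoKao-style multiple-choice reading comprehension.
import torch
import torch.nn as nn
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Weighted sum of a soft KL term (teacher -> student) and hard cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard


class TinyReader(nn.Module):
    """Toy stand-in for an encoder + answer classifier (hypothetical)."""

    def __init__(self, vocab=1000, dim=64, num_options=4):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.cls = nn.Linear(dim, num_options)

    def forward(self, token_ids):
        emb = self.embed(token_ids)          # (batch, seq_len, dim)
        _, h = self.encoder(emb)             # h: (1, batch, dim)
        return self.cls(h.squeeze(0))        # (batch, num_options)


def train_step(student, teacher, token_ids, labels, optimizer, eps=1.0):
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(token_ids)

    # 1) Clean pass with the distillation objective.
    optimizer.zero_grad()
    loss = distillation_loss(student(token_ids), teacher_logits, labels)
    loss.backward()

    # 2) FGM-style adversarial pass: nudge the embedding matrix along its
    #    gradient, accumulate the adversarial loss, then restore the weights.
    emb = student.embed.weight
    backup = emb.data.clone()
    grad_norm = emb.grad.norm()
    if grad_norm > 0:
        emb.data.add_(eps * emb.grad / grad_norm)
    adv_loss = distillation_loss(student(token_ids), teacher_logits, labels)
    adv_loss.backward()
    emb.data.copy_(backup)                   # restore clean embeddings

    optimizer.step()                         # step on clean + adversarial grads
    return loss.item(), adv_loss.item()


if __name__ == "__main__":
    teacher, student = TinyReader(), TinyReader()
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    tokens = torch.randint(0, 1000, (8, 32))   # batch of 8 passages, length 32
    labels = torch.randint(0, 4, (8,))         # 4 answer options per question
    print(train_step(student, teacher, tokens, labels, opt))
```

In this reading, distillation supplies the "additional related knowledge" (soft targets from a teacher trained on richer data), while the adversarial pass acts as a regularizer that compensates for the small GaoKao training set; whether the paper uses exactly this loss combination or perturbation scheme is not specified in the abstract.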