{"title":"Transformer-based Question Text Generation in the Learning System","authors":"Jiajun Li, Huazhu Song, Jun Li","doi":"10.1145/3529466.3529484","DOIUrl":null,"url":null,"abstract":"Question text generation from the triple in knowledge graph exists some challenges in learning system. One is the generated question text is difficult to be understood; the other is it considers few contexts. Therefore, this paper focuses on question text generation. Based on the traditional Bi-LSTM+Attention network model, we import Transformer model into question generation to get the simple question with some triples. In addition, this paper proposes a method to get the diverse expressions of questions (a variety of expressions of a question), that is, to take advantage of the semantic similarity algorithm based on Bi-LSTM with the help of a question database constructed in advance. Finally, a corresponding comparison experiment is designed, and the experimental results demonstrated that the accuracy of question generation experiment based on the Transformer model is 8.36% higher than the traditional Bi-LSTM + Attention network model.","PeriodicalId":375562,"journal":{"name":"Proceedings of the 2022 6th International Conference on Innovation in Artificial Intelligence","volume":"43 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-03-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2022 6th International Conference on Innovation in Artificial Intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3529466.3529484","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
Generating question text from triples in a knowledge graph poses several challenges in learning systems: the generated questions can be hard to understand, and they take little context into account. This paper therefore focuses on question text generation. Building on the traditional Bi-LSTM+Attention network model, we introduce a Transformer model into question generation to produce simple questions from triples. In addition, this paper proposes a method for obtaining diverse expressions of a question (multiple phrasings of the same question), namely a semantic similarity algorithm based on Bi-LSTM applied to a question database constructed in advance. Finally, a comparison experiment is designed, and the results show that the question-generation accuracy of the Transformer-based model is 8.36% higher than that of the traditional Bi-LSTM + Attention network model.
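The diverse-expression step described above can be sketched as a similarity search over a pre-built question database. The sketch below is an assumption about the pipeline's shape, not the paper's implementation: the paper's Bi-LSTM sentence encoder is replaced by a toy bag-of-words embedding, and the function names (`embed`, `diverse_expressions`) and the threshold value are hypothetical.

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words vector; a stand-in for the paper's Bi-LSTM sentence encoder."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def diverse_expressions(generated_question, question_db, threshold=0.4):
    """Retrieve database questions semantically similar to the generated one,
    yielding alternative phrasings of the same question."""
    q_vec = embed(generated_question)
    return [q for q in question_db if cosine(q_vec, embed(q)) >= threshold]
```

With a real Bi-LSTM encoder, `embed` would map a sentence to a dense vector, but the retrieval logic over the question database stays the same.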