{"title":"基于约束的神经问题生成,基于序列到序列和转换模型的隐私策略文档","authors":"Deepti Lamba, W. Hsu","doi":"10.18178/ijke.2021.7.2.135","DOIUrl":null,"url":null,"abstract":"This paper presents the results of constraint-based automatic question generation for paragraphs from privacy policy documents. Existing work on question generation uses sequence-to-sequence and transformer-based approaches. This work introduces constraints to sequence-to-sequence and transformer based T5 model. The notion behind this work is that providing the deep learning models with additional background domain information can aid the system in learning useful patterns. This work presents three kinds of constraints – logical, empirical, and data-based constraint. The constraints are incorporated in the deep learning models by introducing additional penalty or reward terms in the loss function. Automatic evaluation results show that our approach significantly outperforms the state-of-the-art models.","PeriodicalId":88527,"journal":{"name":"International journal of knowledge engineering and soft data paradigms","volume":"98 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Constraint-Based Neural Question Generation Using Sequence-to-Sequence and Transformer Models for Privacy Policy Documents\",\"authors\":\"Deepti Lamba, W. Hsu\",\"doi\":\"10.18178/ijke.2021.7.2.135\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents the results of constraint-based automatic question generation for paragraphs from privacy policy documents. Existing work on question generation uses sequence-to-sequence and transformer-based approaches. This work introduces constraints to sequence-to-sequence and transformer based T5 model. The notion behind this work is that providing the deep learning models with additional background domain information can aid the system in learning useful patterns. This work presents three kinds of constraints – logical, empirical, and data-based constraint. The constraints are incorporated in the deep learning models by introducing additional penalty or reward terms in the loss function. 
Automatic evaluation results show that our approach significantly outperforms the state-of-the-art models.\",\"PeriodicalId\":88527,\"journal\":{\"name\":\"International journal of knowledge engineering and soft data paradigms\",\"volume\":\"98 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International journal of knowledge engineering and soft data paradigms\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.18178/ijke.2021.7.2.135\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International journal of knowledge engineering and soft data paradigms","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.18178/ijke.2021.7.2.135","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Constraint-Based Neural Question Generation Using Sequence-to-Sequence and Transformer Models for Privacy Policy Documents
This paper presents the results of constraint-based automatic question generation for paragraphs from privacy policy documents. Existing work on question generation uses sequence-to-sequence and Transformer-based approaches. This work introduces constraints into a sequence-to-sequence model and the Transformer-based T5 model. The idea is that providing deep learning models with additional background domain knowledge can help the system learn useful patterns. This work presents three kinds of constraints: logical, empirical, and data-based. The constraints are incorporated into the deep learning models by introducing additional penalty or reward terms in the loss function. Automatic evaluation results show that our approach significantly outperforms state-of-the-art models.
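The abstract does not specify the exact form of the penalty or reward terms, so the following is only a minimal illustrative sketch of the general idea it describes: a constraint score added to the standard cross-entropy loss as a weighted term during training. It assumes a PyTorch-style setup; the function `constraint_violation`, the mask encoding of the constraint, and the weight `lambda_c` are hypothetical placeholders, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def constraint_violation(logits, constraint_mask):
    # Hypothetical constraint score: the probability mass the decoder places
    # on tokens disallowed by a domain constraint (mask = 1 for disallowed
    # tokens, 0 otherwise), averaged over all decoding positions.
    probs = F.softmax(logits, dim=-1)
    return (probs * constraint_mask).sum(dim=-1).mean()

def constrained_loss(logits, targets, constraint_mask, lambda_c=0.1):
    # Standard token-level cross-entropy over the generated question ...
    ce = F.cross_entropy(logits.view(-1, logits.size(-1)), targets.view(-1))
    # ... plus a weighted penalty term encoding the background-domain constraint.
    penalty = constraint_violation(logits, constraint_mask)
    return ce + lambda_c * penalty
```

Under this reading, a constraint the model should be rewarded for satisfying would simply enter the sum with a negative weight, and each of the three constraint types (logical, empirical, data-based) would supply its own scoring term.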