Constraint-Based Neural Question Generation Using Sequence-to-Sequence and Transformer Models for Privacy Policy Documents

Deepti Lamba, W. Hsu
{"title":"基于约束的神经问题生成,基于序列到序列和转换模型的隐私策略文档","authors":"Deepti Lamba, W. Hsu","doi":"10.18178/ijke.2021.7.2.135","DOIUrl":null,"url":null,"abstract":"This paper presents the results of constraint-based automatic question generation for paragraphs from privacy policy documents. Existing work on question generation uses sequence-to-sequence and transformer-based approaches. This work introduces constraints to sequence-to-sequence and transformer based T5 model. The notion behind this work is that providing the deep learning models with additional background domain information can aid the system in learning useful patterns. This work presents three kinds of constraints – logical, empirical, and data-based constraint. The constraints are incorporated in the deep learning models by introducing additional penalty or reward terms in the loss function. Automatic evaluation results show that our approach significantly outperforms the state-of-the-art models.","PeriodicalId":88527,"journal":{"name":"International journal of knowledge engineering and soft data paradigms","volume":"98 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Constraint-Based Neural Question Generation Using Sequence-to-Sequence and Transformer Models for Privacy Policy Documents\",\"authors\":\"Deepti Lamba, W. Hsu\",\"doi\":\"10.18178/ijke.2021.7.2.135\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents the results of constraint-based automatic question generation for paragraphs from privacy policy documents. Existing work on question generation uses sequence-to-sequence and transformer-based approaches. This work introduces constraints to sequence-to-sequence and transformer based T5 model. The notion behind this work is that providing the deep learning models with additional background domain information can aid the system in learning useful patterns. This work presents three kinds of constraints – logical, empirical, and data-based constraint. The constraints are incorporated in the deep learning models by introducing additional penalty or reward terms in the loss function. 
Automatic evaluation results show that our approach significantly outperforms the state-of-the-art models.\",\"PeriodicalId\":88527,\"journal\":{\"name\":\"International journal of knowledge engineering and soft data paradigms\",\"volume\":\"98 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International journal of knowledge engineering and soft data paradigms\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.18178/ijke.2021.7.2.135\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International journal of knowledge engineering and soft data paradigms","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.18178/ijke.2021.7.2.135","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

This paper presents the results of constraint-based automatic question generation for paragraphs from privacy policy documents. Existing work on question generation uses sequence-to-sequence and transformer-based approaches; this work introduces constraints into both a sequence-to-sequence model and the transformer-based T5 model. The idea behind this work is that providing the deep learning models with additional background domain information can help the system learn useful patterns. Three kinds of constraints are presented: logical, empirical, and data-based. The constraints are incorporated into the deep learning models by adding penalty or reward terms to the loss function. Automatic evaluation results show that the approach significantly outperforms state-of-the-art models.
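
The abstract does not specify how the penalty or reward terms are attached to the training objective, so the following is only a minimal illustrative sketch in PyTorch under generic sequence-to-sequence assumptions. The function name constrained_loss, the weight lambda_c, and the per-example constraint_score tensor are hypothetical and not taken from the paper; the sketch only shows one common way to fold a constraint signal into the standard cross-entropy loss.

import torch
import torch.nn.functional as F

def constrained_loss(logits, targets, constraint_score, lambda_c=0.1, pad_id=0):
    """Cross-entropy loss augmented with a constraint penalty/reward term.

    logits:           (batch, seq_len, vocab) decoder outputs
    targets:          (batch, seq_len) gold question token ids
    constraint_score: (batch,) per-example score; positive values act as a
                      penalty (e.g. a violated logical, empirical, or
                      data-based check), negative values act as a reward
    lambda_c:         relative weight of the constraint term
                      (hypothetical value, not from the paper)
    """
    # Standard token-level cross-entropy, ignoring padding positions.
    ce = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)),
        targets.reshape(-1),
        ignore_index=pad_id,
    )
    # Constraint term: penalties raise the loss, rewards lower it.
    return ce + lambda_c * constraint_score.mean()

In such a setup, constraint_score would be produced by whatever checker encodes the logical, empirical, or data-based constraint for each generated question; the sketch simply assumes that score is already available as a tensor at training time.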