LEMON: A Knowledge-Enhanced, Type-Constrained, and Grammar-Guided Model for Question Generation Over Knowledge Graphs

Impact Factor: 2.9 · CAS Region 3 (Education) · JCR Q2, COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS
Sheng Bi;Zeyi Miao;Qizhi Min
DOI: 10.1109/TLT.2025.3544454
Journal: IEEE Transactions on Learning Technologies, vol. 18, pp. 256–272
Publication date: 2025-02-20 (Journal Article)
URL: https://ieeexplore.ieee.org/document/10897838/
Citations: 0

Abstract

The objective of question generation from knowledge graphs (KGQG) is to create coherent and answerable questions from a given subgraph and a specified answer entity. KGQG has garnered significant attention due to its pivotal role in enhancing online education. Encoder–decoder architectures have advanced traditional KGQG approaches. However, these approaches encounter challenges in achieving question diversity and grammatical accuracy. They often suffer from a disconnect between the phrasing of the question and the type of the answer entity, a phenomenon known as semantic drift. To address these challenges, we introduce LEMON, a knowledge-enhanced, type-constrained, and grammar-guided model for KGQG. LEMON enhances the input by integrating entity-related knowledge using heuristic rules, which fosters diversity in question generation. It employs a hierarchical global relation embedding with translation loss to align questions with entity types. In addition, it utilizes a graph-based module to aggregate type information from neighboring nodes. The LEMON model incorporates a type-constrained decoder to generate diverse expressions and improves grammatical accuracy through a syntactic and semantic reward function via reinforcement learning. Evaluations on benchmark datasets demonstrate LEMON's strong competitiveness. The study also examines the impact of question generation quality on question-answering systems, providing guidance for future research endeavors in this domain.
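The paper does not detail the "hierarchical global relation embedding with translation loss" here, but such losses typically build on a TransE-style translation objective, which scores a triple (head, relation, tail) by how closely head + relation lands on tail. The following is a minimal illustrative sketch of that general objective, not the authors' implementation; the embeddings and the margin value are hypothetical:

```python
import math

def translation_loss(head, relation, tail, neg_tail, margin=1.0):
    """TransE-style margin loss for one triple: encourage
    ||h + r - t|| + margin <= ||h + r - t_neg||, where t_neg is a
    corrupted (negative) tail entity."""
    def dist(h, r, t):
        # Euclidean distance between the translated head (h + r) and t
        return math.sqrt(sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)))
    pos = dist(head, relation, tail)      # score of the true triple
    neg = dist(head, relation, neg_tail)  # score of the corrupted triple
    return max(0.0, margin + pos - neg)   # hinge: zero once the gap is wide enough

# Toy example: h + r coincides with the true tail, so the true triple
# scores 0 and the far-away corrupted tail drives the hinge to zero.
h, r = [0.1, 0.2], [0.3, -0.1]
t_true = [0.4, 0.1]
t_corrupt = [5.0, 5.0]
print(translation_loss(h, r, t_true, t_corrupt))  # → 0.0
```

In a full model this loss would be summed over sampled negative triples and minimized jointly with the generation objective, so that question representations stay aligned with answer-entity types.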
Source Journal

IEEE Transactions on Learning Technologies (COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS)
CiteScore: 7.50
Self-citation rate: 5.40%
Articles per year: 82
Review time: >12 weeks
Journal description: The IEEE Transactions on Learning Technologies covers all advances in learning technologies and their applications, including but not limited to the following topics: innovative online learning systems; intelligent tutors; educational games; simulation systems for education and training; collaborative learning tools; learning with mobile devices; wearable devices and interfaces for learning; personalized and adaptive learning systems; tools for formative and summative assessment; tools for learning analytics and educational data mining; ontologies for learning systems; standards and web services that support learning; authoring tools for learning materials; computer support for peer tutoring; learning via computer-mediated inquiry, field, and lab work; social learning techniques; social networks and infrastructures for learning and knowledge sharing; and creation and management of learning objects.