GLAF: Global-and-Local Attention Flow Model for Question Answering

Shao-Hua Sun
DOI: 10.1145/3571560.3571570
Published in: Proceedings of the 6th International Conference on Advances in Artificial Intelligence, 2022-10-21

Abstract

Question answering is one of the most well-studied tasks in the natural language processing (NLP) community; it aims to extract an answer span from a given document and query. Previous attempts decomposed this task into two subtasks: understanding the semantic information of the given document and query, and then finding a reasonable textual span within the document as the corresponding answer. However, a major drawback of previous work is that it fails to extract sufficient semantics buried within the input. To alleviate this issue, in this paper we propose a global-local attention flow model that exploits semantic features from different aspects and reduces the redundancy of the model encoder. Experimental results on the SQuAD dataset show that our model outperforms the baseline models, demonstrating the effectiveness of the proposed method.
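The abstract does not specify the model's architecture, but the core idea of combining a global (document-query) attention view with a local (neighborhood) view can be illustrated with a minimal sketch. Everything below — the fusion by averaging, the window size, and the function names — is an assumption for illustration, not the paper's actual method:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_local_attention(doc, query, window=2):
    """Blend a global and a local attention view of the document.

    doc:   (n, d) token embeddings of the document
    query: (m, d) token embeddings of the query
    Returns an (n, d) query-aware document representation.
    (Hypothetical sketch; the paper's fusion operator is not given.)
    """
    # Global view: every document token attends to every query token.
    scores = doc @ query.T                          # (n, m)
    global_ctx = softmax(scores, axis=-1) @ query   # (n, d)

    # Local view: each token attends only to its +/- `window` neighbors,
    # capturing phrase-level semantics around it.
    n, _ = doc.shape
    local_ctx = np.zeros_like(doc)
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        w = softmax(doc[i] @ doc[lo:hi].T)
        local_ctx[i] = w @ doc[lo:hi]

    # Fuse the two views; a simple average stands in for whatever
    # fusion the full paper uses.
    return 0.5 * (global_ctx + local_ctx)
```

The fused representation would then typically feed start/end span-scoring heads, as in standard SQuAD-style extractive QA models.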