Attention-based neural network for short-text question answering

Yongxin Peng, B. Liu
DOI: 10.1145/3234804.3234813
Published in: International Conference on Deep Learning Technologies, 2018-06-27
Citations: 9

Abstract

Question answering (QA) is a popular topic in information retrieval. Many prior studies rely on classifiers built from a large number of handcrafted syntactic and semantic features, along with external resources such as WordNet, an English lexical database grounded in cognitive linguistics. Deep learning approaches have recently achieved strong performance in QA; however, they typically must be combined with additional features, such as word overlap. In this work, the factoid answer retrieval task is introduced, and its effective solution under a deep learning framework is investigated. An attention-based convolutional neural network model is proposed that captures word- and phrase-level interaction information and estimates the probability that each candidate answer is correct, which is then used to re-rank the candidates. The proposed model is compared with other models on the popular Text REtrieval Conference (TREC) QA benchmark data. Results show that the proposed model achieves a significant performance improvement.
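The re-ranking idea described above — scoring each candidate answer against the question via word-level attention and sorting by that score — can be illustrated with a minimal plain-Python sketch. This is an assumption-laden simplification, not the authors' model: it replaces the trained convolutional network with cosine similarity over toy word vectors, keeping only the attention-weighted pooling and re-ranking steps.

```python
import math

# Basic vector helpers (stand-ins for learned embeddings).
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u)) or 1.0  # avoid division by zero

def cosine(u, v):
    return dot(u, v) / (norm(u) * norm(v))

def attention_pool(q_vecs, a_vecs):
    """Word-level attention between a question and a candidate answer.

    A[i][j] holds the similarity between question word i and answer word j;
    row/column sums act as per-word importance weights for pooling.
    (The paper's model learns these interactions with a CNN instead.)
    """
    A = [[cosine(q, a) for a in a_vecs] for q in q_vecs]
    q_w = [sum(row) for row in A]
    a_w = [sum(A[i][j] for i in range(len(q_vecs))) for j in range(len(a_vecs))]

    def pool(vecs, w):
        z = sum(w) or 1.0
        dim = len(vecs[0])
        return [sum(w[k] * vecs[k][d] for k in range(len(vecs))) / z
                for d in range(dim)]

    return pool(q_vecs, q_w), pool(a_vecs, a_w)

def score(q_vecs, a_vecs):
    """Similarity of attention-pooled representations, used as the rank score."""
    q_rep, a_rep = attention_pool(q_vecs, a_vecs)
    return cosine(q_rep, a_rep)

def rerank(q_vecs, candidates):
    """Re-rank candidate answers (lists of word vectors) by their score."""
    return sorted(candidates, key=lambda a: score(q_vecs, a), reverse=True)
```

In the paper's setting the score would come from a trained network's output probability rather than raw cosine similarity, but the pipeline shape — interaction matrix, attention-weighted pooling, candidate re-ranking — is the same.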