Focusing Attention Network for Answer Ranking

Yufei Xie, Shuchun Liu, Tangren Yao, Yao Peng, Zhao Lu
{"title":"聚焦注意力网络的答案排名","authors":"Yufei Xie, Shuchun Liu, Tangren Yao, Yao Peng, Zhao Lu","doi":"10.1145/3308558.3313518","DOIUrl":null,"url":null,"abstract":"Answer ranking is an important task in Community Question Answering (CQA), by which “Good” answers should be ranked in the front of “Bad” or “Potentially Useful” answers. The state of the art is the attention-based classification framework that learns the mapping between the questions and the answers. However, we observe that existing attention-based methods perform poorly on complicated question-answer pairs. One major reason is that existing methods cannot get accurate alignments between questions and answers for such pairs. We call the phenomenon “attention divergence”. In this paper, we propose a new attention mechanism, called Focusing Attention Network(FAN), which can automatically draw back the divergent attention by adding the semantic, and metadata features. Our Model can focus on the most important part of the sentence and therefore improve the answer ranking performance. Experimental results on the CQA dataset of SemEval-2016 and SemEval-2017 demonstrate that our method respectively attains 79.38 and 88.72 on MAP and outperforms the Top-1 system in the shared task by 0.19 and 0.29.","PeriodicalId":23013,"journal":{"name":"The World Wide Web Conference","volume":"70 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2019-05-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Focusing Attention Network for Answer Ranking\",\"authors\":\"Yufei Xie, Shuchun Liu, Tangren Yao, Yao Peng, Zhao Lu\",\"doi\":\"10.1145/3308558.3313518\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Answer ranking is an important task in Community Question Answering (CQA), by which “Good” answers should be ranked in the front of “Bad” or “Potentially Useful” answers. The state of the art is the attention-based classification framework that learns the mapping between the questions and the answers. However, we observe that existing attention-based methods perform poorly on complicated question-answer pairs. One major reason is that existing methods cannot get accurate alignments between questions and answers for such pairs. We call the phenomenon “attention divergence”. In this paper, we propose a new attention mechanism, called Focusing Attention Network(FAN), which can automatically draw back the divergent attention by adding the semantic, and metadata features. Our Model can focus on the most important part of the sentence and therefore improve the answer ranking performance. 
Experimental results on the CQA dataset of SemEval-2016 and SemEval-2017 demonstrate that our method respectively attains 79.38 and 88.72 on MAP and outperforms the Top-1 system in the shared task by 0.19 and 0.29.\",\"PeriodicalId\":23013,\"journal\":{\"name\":\"The World Wide Web Conference\",\"volume\":\"70 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-05-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The World Wide Web Conference\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3308558.3313518\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The World Wide Web Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3308558.3313518","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6

Abstract

Answer ranking is an important task in Community Question Answering (CQA), in which “Good” answers should be ranked ahead of “Bad” or “Potentially Useful” answers. The state of the art is the attention-based classification framework, which learns the mapping between questions and answers. However, we observe that existing attention-based methods perform poorly on complicated question-answer pairs. One major reason is that existing methods cannot obtain accurate alignments between questions and answers for such pairs; we call this phenomenon “attention divergence”. In this paper, we propose a new attention mechanism, the Focusing Attention Network (FAN), which automatically draws back divergent attention by adding semantic and metadata features. Our model can focus on the most important part of the sentence and therefore improve answer ranking performance. Experimental results on the CQA datasets of SemEval-2016 and SemEval-2017 demonstrate that our method attains MAP scores of 79.38 and 88.72, respectively, outperforming the Top-1 systems in the shared tasks by 0.19 and 0.29.
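The abstract does not give implementation details, but the core idea, letting extra semantic and metadata features steer the attention weights, can be illustrated. Below is a minimal PyTorch sketch of one plausible reading: the extra features enter an additive attention score as a bias term, nudging the weights toward question-relevant answer tokens. All names here (FeatureAugmentedAttention, feat_dim, and so on) are hypothetical and do not come from the authors' code.

```python
# Minimal sketch (not the authors' implementation): additive attention over
# answer tokens, conditioned on the question and on extra feature vectors.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureAugmentedAttention(nn.Module):
    """Attention whose scores also see semantic/metadata features.

    Hypothetical reading of the paper's idea: the features act as a bias
    that pulls ("focuses") attention back toward relevant tokens.
    """
    def __init__(self, hidden_dim: int, feat_dim: int):
        super().__init__()
        self.w_q = nn.Linear(hidden_dim, hidden_dim, bias=False)  # question summary
        self.w_a = nn.Linear(hidden_dim, hidden_dim, bias=False)  # answer tokens
        self.w_f = nn.Linear(feat_dim, hidden_dim, bias=False)    # extra features
        self.v = nn.Linear(hidden_dim, 1, bias=False)             # score projection

    def forward(self, q_summary, a_hidden, feats):
        # q_summary: (batch, hidden)        pooled question representation
        # a_hidden:  (batch, len, hidden)   per-token answer states
        # feats:     (batch, feat_dim)      semantic + metadata features
        bias = (self.w_q(q_summary) + self.w_f(feats)).unsqueeze(1)  # (batch, 1, hidden)
        scores = self.v(torch.tanh(self.w_a(a_hidden) + bias))       # (batch, len, 1)
        alpha = F.softmax(scores, dim=1)                             # attention weights
        context = (alpha * a_hidden).sum(dim=1)                      # (batch, hidden)
        return context, alpha
```

In a full ranker, the returned context vector would feed a classifier that scores each answer as “Good”, “Bad”, or “Potentially Useful”; answers are then ranked by the “Good” probability.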