Attention-based Convolutional Neural Network for Answer Selection using BERT

H. Khorashadizadeh, R. Monsefi, Shima Foolad
{"title":"Attention-based Convolutional Neural Network for Answer Selection using BERT","authors":"H. Khorashadizadeh, R. Monsefi, Shima Foolad","doi":"10.1109/CFIS49607.2020.9238669","DOIUrl":null,"url":null,"abstract":"Question answering is at the heart of natural language processing and is composed of two sections: Reading Comprehension and Answer Selection. Prior to deep learning, all natural language processing solutions including Question Answering were based on statistical methods and researchers generated set of features based on text input. Answer Selection is a fundamental task in Question Answering, also a tough one because of the complicated semantic relations between questions and answers. Attention is a mechanism that has revolutionized deep learning community. Leveraging pretrained language models have made a breakthrough in most natural language processing tasks. Bert is one of the top pretrained deep language models that has achieved state-of-the-art on an extensive area of nlp tasks. In this paper we utilize an attention-based convolutional neural network. First, we employ BERT, a state-of-the-art pre-trained contextual as the embedding layer. Second, we enhance the model by adding some more attentive features. We evaluate the performance of our model on WikiQA dataset. Our experiments show that our model is superior to many other answer-selection models.","PeriodicalId":128323,"journal":{"name":"2020 8th Iranian Joint Congress on Fuzzy and intelligent Systems (CFIS)","volume":"125 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 8th Iranian Joint Congress on Fuzzy and intelligent Systems (CFIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CFIS49607.2020.9238669","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2

Abstract

Question answering is at the heart of natural language processing and comprises two main tasks: reading comprehension and answer selection. Before deep learning, natural language processing solutions, including question answering, were based on statistical methods, with researchers hand-engineering feature sets from the text input. Answer selection is a fundamental task in question answering, and a difficult one because of the complex semantic relations between questions and answers. Attention is a mechanism that has revolutionized the deep learning community, and leveraging pre-trained language models has produced breakthroughs across most natural language processing tasks. BERT is one of the leading pre-trained deep language models and has achieved state-of-the-art results on a wide range of NLP tasks. In this paper we present an attention-based convolutional neural network for answer selection. First, we employ BERT, a state-of-the-art pre-trained contextual language model, as the embedding layer. Second, we enhance the model by adding further attentive features. We evaluate our model on the WikiQA dataset; our experiments show that it outperforms many other answer-selection models.
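The abstract outlines an architecture in which BERT serves as a contextual embedding layer feeding an attention-based CNN that scores question-answer pairs. Below is a minimal sketch of that general approach, not the authors' implementation: the filter count, kernel size, attention-pooling scheme, and cosine-similarity scoring are all illustrative assumptions, and `bert-base-uncased` stands in for whichever BERT variant the paper used.

```python
# Sketch of a BERT-embedded, attention-based CNN for answer selection.
# NOT the paper's exact architecture: filter sizes, attention pooling,
# and cosine scoring below are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

class AttentiveCNNScorer(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", n_filters=100, kernel=3):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)  # embedding layer
        hidden = self.bert.config.hidden_size
        self.conv = nn.Conv1d(hidden, n_filters, kernel, padding=kernel // 2)
        self.attn = nn.Linear(n_filters, 1)  # per-token attention scores

    def encode(self, input_ids, attention_mask):
        # Contextual token embeddings from BERT: (batch, tokens, hidden).
        h = self.bert(input_ids=input_ids,
                      attention_mask=attention_mask).last_hidden_state
        c = torch.relu(self.conv(h.transpose(1, 2)))  # (B, filters, T)
        c = c.transpose(1, 2)                         # (B, T, filters)
        # Attention-weighted pooling over token positions, masking padding.
        scores = self.attn(c).squeeze(-1)             # (B, T)
        scores = scores.masked_fill(attention_mask == 0, -1e9)
        alpha = torch.softmax(scores, dim=-1).unsqueeze(-1)
        return (alpha * c).sum(dim=1)                 # (B, filters)

    def forward(self, q_ids, q_mask, a_ids, a_mask):
        q = self.encode(q_ids, q_mask)
        a = self.encode(a_ids, a_mask)
        return torch.cosine_similarity(q, a, dim=-1)  # relevance score

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AttentiveCNNScorer()
q = tok("how are glacier caves formed?", return_tensors="pt")
a = tok("A glacier cave is a cave formed within the ice of a glacier.",
        return_tensors="pt")
with torch.no_grad():
    print(model(q.input_ids, q.attention_mask, a.input_ids, a.attention_mask))
```

At inference time, the candidate answers for a question would be ranked by this score, matching the MAP/MRR ranking protocol typically used to evaluate answer selection on WikiQA.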