{"title":"基于BERT的基于注意力的卷积神经网络答案选择","authors":"H. Khorashadizadeh, R. Monsefi, Shima Foolad","doi":"10.1109/CFIS49607.2020.9238669","DOIUrl":null,"url":null,"abstract":"Question answering is at the heart of natural language processing and is composed of two sections: Reading Comprehension and Answer Selection. Prior to deep learning, all natural language processing solutions including Question Answering were based on statistical methods and researchers generated set of features based on text input. Answer Selection is a fundamental task in Question Answering, also a tough one because of the complicated semantic relations between questions and answers. Attention is a mechanism that has revolutionized deep learning community. Leveraging pretrained language models have made a breakthrough in most natural language processing tasks. Bert is one of the top pretrained deep language models that has achieved state-of-the-art on an extensive area of nlp tasks. In this paper we utilize an attention-based convolutional neural network. First, we employ BERT, a state-of-the-art pre-trained contextual as the embedding layer. Second, we enhance the model by adding some more attentive features. We evaluate the performance of our model on WikiQA dataset. Our experiments show that our model is superior to many other answer-selection models.","PeriodicalId":128323,"journal":{"name":"2020 8th Iranian Joint Congress on Fuzzy and intelligent Systems (CFIS)","volume":"125 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Attention-based Convolutional Neural Network for Answer Selection using BERT\",\"authors\":\"H. Khorashadizadeh, R. Monsefi, Shima Foolad\",\"doi\":\"10.1109/CFIS49607.2020.9238669\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Question answering is at the heart of natural language processing and is composed of two sections: Reading Comprehension and Answer Selection. Prior to deep learning, all natural language processing solutions including Question Answering were based on statistical methods and researchers generated set of features based on text input. Answer Selection is a fundamental task in Question Answering, also a tough one because of the complicated semantic relations between questions and answers. Attention is a mechanism that has revolutionized deep learning community. Leveraging pretrained language models have made a breakthrough in most natural language processing tasks. Bert is one of the top pretrained deep language models that has achieved state-of-the-art on an extensive area of nlp tasks. In this paper we utilize an attention-based convolutional neural network. First, we employ BERT, a state-of-the-art pre-trained contextual as the embedding layer. Second, we enhance the model by adding some more attentive features. We evaluate the performance of our model on WikiQA dataset. 
Our experiments show that our model is superior to many other answer-selection models.\",\"PeriodicalId\":128323,\"journal\":{\"name\":\"2020 8th Iranian Joint Congress on Fuzzy and intelligent Systems (CFIS)\",\"volume\":\"125 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 8th Iranian Joint Congress on Fuzzy and intelligent Systems (CFIS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CFIS49607.2020.9238669\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 8th Iranian Joint Congress on Fuzzy and intelligent Systems (CFIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CFIS49607.2020.9238669","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Attention-based Convolutional Neural Network for Answer Selection using BERT
Question answering is at the heart of natural language processing and comprises two subtasks: reading comprehension and answer selection. Before deep learning, natural language processing solutions, including question answering, were based on statistical methods, with researchers hand-crafting sets of features from the input text. Answer selection is a fundamental task in question answering, and a difficult one because of the complicated semantic relations between questions and answers. Attention is a mechanism that has revolutionized the deep learning community, and leveraging pretrained language models has produced breakthroughs across most natural language processing tasks. BERT is among the top pretrained deep language models and has achieved state-of-the-art results on a wide range of NLP tasks. In this paper we employ an attention-based convolutional neural network for answer selection. First, we use BERT, a state-of-the-art pre-trained contextual language model, as the embedding layer. Second, we enhance the model by adding further attentive features. We evaluate our model on the WikiQA dataset; our experiments show that it is superior to many other answer-selection models.
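The abstract only sketches the architecture (BERT embeddings feeding an attention-based CNN that scores question-answer pairs). The following is a minimal illustrative sketch of that general design, not the authors' implementation: it assumes PyTorch and the HuggingFace transformers library, and the model name, layer sizes, and attentive-pooling choice are assumptions made for the example.

import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel, BertTokenizer

class AttentiveCNNSelector(nn.Module):
    """Hypothetical BERT + attention-based CNN answer-selection model."""

    def __init__(self, hidden=768, n_filters=128, kernel=3):
        super().__init__()
        # BERT supplies contextual token embeddings (the "embedding layer")
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # 1-D convolution over the token-embedding sequence
        self.conv = nn.Conv1d(hidden, n_filters, kernel, padding=kernel // 2)
        # simple additive attention used to pool the convolved features
        self.attn = nn.Linear(n_filters, 1)

    def encode(self, input_ids, attention_mask):
        emb = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        # (batch, len, hidden) -> conv over time -> (batch, len, n_filters)
        feats = torch.relu(self.conv(emb.transpose(1, 2))).transpose(1, 2)
        # attention weights over tokens, with padding positions masked out
        scores = self.attn(feats).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, -1e9)
        weights = F.softmax(scores, dim=-1).unsqueeze(-1)
        return (weights * feats).sum(dim=1)  # attentive pooling

    def forward(self, q_ids, q_mask, a_ids, a_mask):
        q_vec = self.encode(q_ids, q_mask)
        a_vec = self.encode(a_ids, a_mask)
        # cosine similarity as the question-answer relevance score
        return F.cosine_similarity(q_vec, a_vec)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
q = tokenizer("how are glacier caves formed?", return_tensors="pt")
a = tokenizer("A glacier cave is a cave formed within the ice of a glacier.",
              return_tensors="pt")
model = AttentiveCNNSelector()
score = model(q.input_ids, q.attention_mask, a.input_ids, a.attention_mask)

In a setup like this, each candidate answer for a question would be scored this way and the candidates ranked by score, which is how answer selection on WikiQA is typically evaluated (e.g., with MAP and MRR).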