{"title":"Open-Domain Document-Based Automatic QA Models Based on CNN and Attention Mechanism","authors":"Guangjie Zhang, Xumin Fan, Canghong Jin, Ming-hui Wu","doi":"10.1109/ICBK.2019.00051","DOIUrl":null,"url":null,"abstract":"The open domain automatic question answering models have been widely studied in recent years. When dealing with automatic question answering systems, the RNN-based models are the most commonly used models. However we choose the CNN-based model to construct our question answering models, and use the attention mechanism to enhance the performance. We test our models on Microsoft open domain automatic question answering dataset. Experiments show that compared with the models without attention mechanism, our models get the best results. Experiments also show that adding the RNN network in our model can further improve the performance.","PeriodicalId":383917,"journal":{"name":"2019 IEEE International Conference on Big Knowledge (ICBK)","volume":"27 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE International Conference on Big Knowledge (ICBK)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICBK.2019.00051","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
Open-domain automatic question answering models have been widely studied in recent years. RNN-based models are the most commonly used for automatic question answering systems. In contrast, we construct our question answering models on a CNN base and use an attention mechanism to enhance performance. We evaluate our models on the Microsoft open-domain automatic question answering dataset. Experiments show that, compared with models without the attention mechanism, our models achieve the best results. Experiments also show that adding an RNN network to our model further improves performance.
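The abstract's core idea, a CNN encoder over document tokens with attention guided by the question, can be sketched as follows. This is a minimal illustration in NumPy, not the authors' implementation: the filter sizes, dimensions, and the dot-product attention scoring are assumptions made for the example.

```python
import numpy as np

def conv1d_relu(x, w, b):
    """1D convolution over a token sequence, followed by ReLU.

    x: (seq_len, emb)      -- token embeddings
    w: (k, emb, filters)   -- k-gram convolution filters
    b: (filters,)          -- bias per filter
    Returns (seq_len - k + 1, filters) feature maps.
    """
    k, emb, f = w.shape
    out_len = x.shape[0] - k + 1
    out = np.zeros((out_len, f))
    for i in range(out_len):
        window = x[i:i + k]  # (k, emb) slice of the sequence
        # Contract the window against every filter at once.
        out[i] = np.tensordot(window, w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)

def attention_pool(feats, query):
    """Pool CNN features into one vector, weighted by relevance to a query.

    feats: (L, d) convolutional features of the document
    query: (d,)   a question representation (assumed here to be a vector)
    Returns the attention-weighted sum and the attention weights.
    """
    scores = feats @ query
    scores -= scores.max()          # numerical stability for softmax
    weights = np.exp(scores) / np.exp(scores).sum()
    return weights @ feats, weights

rng = np.random.default_rng(0)
doc = rng.standard_normal((10, 8))        # 10 tokens, 8-dim embeddings
w = rng.standard_normal((3, 8, 16)) * 0.1 # trigram filters, 16 channels
b = np.zeros(16)
q = rng.standard_normal(16)               # hypothetical question vector

feats = conv1d_relu(doc, w, b)            # (8, 16) feature maps
pooled, attn = attention_pool(feats, q)   # (16,) document summary
```

The attention weights make the pooled representation focus on document positions whose CNN features align with the question, which is the mechanism the abstract credits for the performance gain over a plain CNN.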