{"title":"基于Bi-LSTM和自注意机制的问答系统研究","authors":"Hao Xiang, J. Gu","doi":"10.1109/ICIEA49774.2020.9101985","DOIUrl":null,"url":null,"abstract":"With the development of artificial intelligence technology, intelligent question an-swering has become a hot research direction in the field of natural language pro-cessing. This paper proposes a question answering method based on Bi-LSTM and self-attention mechanism model. This method uses Bi-LSTM to encode and align the question and answer respectively, then uses self-attention to obtain the relationship between keywords, and finally performs softmax through the fully connected layer to obtain the similarity between the question and answer. Finally, in the experiment, compared with the traditional attention model, the accuracy rate of this model was increased by 1.6%, and the recall rate was increased by 1.5%.","PeriodicalId":306461,"journal":{"name":"2020 IEEE 7th International Conference on Industrial Engineering and Applications (ICIEA)","volume":"56 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Research on Question Answering System Based on Bi-LSTM and Self-attention Mechanism\",\"authors\":\"Hao Xiang, J. Gu\",\"doi\":\"10.1109/ICIEA49774.2020.9101985\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"With the development of artificial intelligence technology, intelligent question an-swering has become a hot research direction in the field of natural language pro-cessing. This paper proposes a question answering method based on Bi-LSTM and self-attention mechanism model. This method uses Bi-LSTM to encode and align the question and answer respectively, then uses self-attention to obtain the relationship between keywords, and finally performs softmax through the fully connected layer to obtain the similarity between the question and answer. Finally, in the experiment, compared with the traditional attention model, the accuracy rate of this model was increased by 1.6%, and the recall rate was increased by 1.5%.\",\"PeriodicalId\":306461,\"journal\":{\"name\":\"2020 IEEE 7th International Conference on Industrial Engineering and Applications (ICIEA)\",\"volume\":\"56 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-04-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 IEEE 7th International Conference on Industrial Engineering and Applications (ICIEA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICIEA49774.2020.9101985\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE 7th International Conference on Industrial Engineering and Applications (ICIEA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIEA49774.2020.9101985","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Research on Question Answering System Based on Bi-LSTM and Self-attention Mechanism
With the development of artificial intelligence technology, intelligent question answering has become a hot research direction in the field of natural language processing. This paper proposes a question answering method based on Bi-LSTM and a self-attention mechanism. The method uses a Bi-LSTM to separately encode and align the question and the answer, applies self-attention to capture the relationships between keywords, and finally passes the result through a fully connected layer with softmax to obtain the similarity between the question and the answer. In experiments, compared with a traditional attention model, the proposed model improved accuracy by 1.6% and recall by 1.5%.
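To make the described pipeline concrete, the following is a minimal PyTorch sketch of the architecture the abstract outlines: Bi-LSTM encoders for question and answer, self-attention over the encoded tokens, and a fully connected layer with softmax producing a similarity distribution. The shared encoder, all dimensions, the mean pooling, and the two-class output head are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch of a Bi-LSTM + self-attention QA matcher.
# All hyperparameters and design choices below are assumptions
# for illustration, not the authors' published implementation.
import torch
import torch.nn as nn


class BiLSTMSelfAttentionQA(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_heads=4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Shared Bi-LSTM encoder (assumption: the paper may use separate
        # encoders); bidirectional output dimension is 2 * hidden_dim.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Self-attention over the concatenated question/answer tokens,
        # intended to capture relationships between keywords.
        self.self_attn = nn.MultiheadAttention(2 * hidden_dim, num_heads,
                                               batch_first=True)
        # Fully connected layer; softmax yields P(match) vs. P(no match).
        self.fc = nn.Linear(2 * hidden_dim, 2)

    def forward(self, question_ids, answer_ids):
        q, _ = self.encoder(self.embedding(question_ids))  # (B, Lq, 2H)
        a, _ = self.encoder(self.embedding(answer_ids))    # (B, La, 2H)
        seq = torch.cat([q, a], dim=1)                     # (B, Lq+La, 2H)
        attn_out, _ = self.self_attn(seq, seq, seq)
        pooled = attn_out.mean(dim=1)                      # mean-pool tokens
        return torch.softmax(self.fc(pooled), dim=-1)      # similarity dist.


# Toy usage: a batch of 2 question/answer pairs over a 1000-word vocabulary.
model = BiLSTMSelfAttentionQA(vocab_size=1000)
question = torch.randint(0, 1000, (2, 12))
answer = torch.randint(0, 1000, (2, 20))
print(model(question, answer).shape)  # torch.Size([2, 2])
```

One design note on this sketch: concatenating the two encoded sequences before self-attention lets attention heads relate question keywords directly to answer keywords; an alternative reading of the abstract would apply self-attention within each sequence and compare pooled vectors, which the text does not settle.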