A model based on dual-layer attention mechanism for semantic matching

Zhenzhen Hou, Xiaodong Cai, Si Chen, Bo Li
{"title":"基于双层注意机制的语义匹配模型","authors":"Zhenzhen Hou, Xiaodong Cai, Si Chen, Bo Li","doi":"10.1109/ICIASE45644.2019.9074041","DOIUrl":null,"url":null,"abstract":"In community question and answering (Q&A) systems, due to the diversity of words and syntactic structure, matching question pairs representing similar meaning is a challenging task. A novel model based on dual-layer attention mechanism for semantic matching is proposed in this work. Firstly, an attention-based preprocessing method is used on the word representation layer to reduce redundant information. Secondly, a bilateral multiple perspectives attention mechanism is utilized on the context representation layer to obtain more interactive information. Finally, the obtained information is passed through a Bi-directional Long Short Term Memory Network (BiLSTM). Then the obtained final time steps of the two sequences are combined for prediction. The experimental results show that the accuracy of the proposed model in our self-defined Chinese dataset is up to 95.54% and also 88.91% with Quora dataset. It outperforms the existing advanced benchmark models. 
The model also provides stability and scalability to natural language inference tasks with the accuracy of 87.4% in the Stanford Natural Language Inference (SNLI) dataset.","PeriodicalId":206741,"journal":{"name":"2019 IEEE International Conference of Intelligent Applied Systems on Engineering (ICIASE)","volume":"45 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"A model based on dual-layer attention mechanism for semantic matching\",\"authors\":\"Zhenzhen Hou, Xiaodong Cai, Si Chen, Bo Li\",\"doi\":\"10.1109/ICIASE45644.2019.9074041\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In community question and answering (Q&A) systems, due to the diversity of words and syntactic structure, matching question pairs representing similar meaning is a challenging task. A novel model based on dual-layer attention mechanism for semantic matching is proposed in this work. Firstly, an attention-based preprocessing method is used on the word representation layer to reduce redundant information. Secondly, a bilateral multiple perspectives attention mechanism is utilized on the context representation layer to obtain more interactive information. Finally, the obtained information is passed through a Bi-directional Long Short Term Memory Network (BiLSTM). Then the obtained final time steps of the two sequences are combined for prediction. The experimental results show that the accuracy of the proposed model in our self-defined Chinese dataset is up to 95.54% and also 88.91% with Quora dataset. It outperforms the existing advanced benchmark models. 
The model also provides stability and scalability to natural language inference tasks with the accuracy of 87.4% in the Stanford Natural Language Inference (SNLI) dataset.\",\"PeriodicalId\":206741,\"journal\":{\"name\":\"2019 IEEE International Conference of Intelligent Applied Systems on Engineering (ICIASE)\",\"volume\":\"45 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-04-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 IEEE International Conference of Intelligent Applied Systems on Engineering (ICIASE)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICIASE45644.2019.9074041\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE International Conference of Intelligent Applied Systems on Engineering (ICIASE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICIASE45644.2019.9074041","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

In community question-and-answer (Q&A) systems, the diversity of vocabulary and syntactic structure makes matching question pairs with similar meanings a challenging task. This work proposes a novel semantic matching model based on a dual-layer attention mechanism. First, an attention-based preprocessing method is applied at the word representation layer to reduce redundant information. Second, a bilateral multi-perspective attention mechanism is used at the context representation layer to capture more interaction information. Finally, the resulting representations are passed through a Bi-directional Long Short-Term Memory network (BiLSTM), and the final time-step outputs of the two sequences are combined for prediction. Experimental results show that the proposed model reaches 95.54% accuracy on our self-built Chinese dataset and 88.91% on the Quora dataset, outperforming existing state-of-the-art benchmark models. The model also generalizes to natural language inference, reaching 87.4% accuracy on the Stanford Natural Language Inference (SNLI) dataset.
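The pipeline the abstract describes — word-level attention to filter redundant tokens, cross-sentence attention at the context layer, BiLSTM encoding, and prediction from the combined final time steps — can be sketched in PyTorch. This is a minimal illustration, not the authors' implementation: all class names, dimensions, and the single-perspective cross-attention (the paper uses bilateral multi-perspective matching) are simplifying assumptions.

```python
import torch
import torch.nn as nn


class DualLayerAttentionMatcher(nn.Module):
    """Hypothetical sketch: word-layer attention filtering, context-layer
    cross-attention, BiLSTM encoding, and classification over the
    concatenated final time steps of the two question encodings."""

    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Word representation layer: per-token attention score.
        self.word_attn = nn.Linear(embed_dim, 1)
        # BiLSTM over [token embedding; attended context] features.
        self.bilstm = nn.LSTM(2 * embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(4 * hidden_dim, num_classes)

    def encode(self, a, b):
        # Word layer: attention weights down-weight redundant tokens of `a`.
        w = torch.softmax(self.word_attn(a), dim=1)          # (B, La, 1)
        a = a * w
        # Context layer: attend from `a` to `b` (one matching perspective).
        scores = torch.bmm(a, b.transpose(1, 2))             # (B, La, Lb)
        attended = torch.bmm(torch.softmax(scores, dim=-1), b)
        # Encode with the BiLSTM and keep the final time step.
        out, _ = self.bilstm(torch.cat([a, attended], dim=-1))
        return out[:, -1, :]                                 # (B, 2*hidden)

    def forward(self, q1, q2):
        a, b = self.embed(q1), self.embed(q2)
        v1, v2 = self.encode(a, b), self.encode(b, a)
        # Combine the two sequences' final time steps for prediction.
        return self.classifier(torch.cat([v1, v2], dim=-1))


model = DualLayerAttentionMatcher()
q1 = torch.randint(0, 1000, (2, 7))   # batch of 2 question pairs
q2 = torch.randint(0, 1000, (2, 9))   # the two questions may differ in length
logits = model(q1, q2)
print(logits.shape)                   # torch.Size([2, 2])
```

Encoding each question while attending to the other, then concatenating both final states, is what makes the matching bilateral: each side contributes a representation conditioned on its counterpart.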