{"title":"An efficient framework for sentence similarity inspired by quantum computing","authors":"Yan Yu, Dong Qiu, Ruiteng Yan","doi":"10.1109/ICKG52313.2021.00030","DOIUrl":null,"url":null,"abstract":"Accurately extracting the semantic information and the syntactic structure of sentences is important in natural language processing. Existing methods mainly combine the dependency tree to deep learning with complex computation time to achieve enough semantic information. It is essential to obtain sufficient semantic information and syntactic structures without any prior knowledge excepting word2vec. This paper proposes a model on sentence representation inspired by quantum entanglement using the tensor product to entangle both two consecutive notional words and words with depen-dencies. Inspired by quantum entanglement coefficients, we construct two different entanglement coefficients to weight the different semantic contributions of words with different relations. Finally, the proposed model is applied to SICK_train to verify their performances. The experimental results show that the provided methods achieve perfect results.","PeriodicalId":174126,"journal":{"name":"2021 IEEE International Conference on Big Knowledge (ICBK)","volume":"227 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Conference on Big Knowledge (ICBK)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICKG52313.2021.00030","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Accurately extracting the semantic information and the syntactic structure of sentences is important in natural language processing. Existing methods mainly combine dependency trees with deep learning, which incurs high computational cost, to capture enough semantic information. It is therefore desirable to obtain sufficient semantic information and syntactic structure without any prior knowledge other than word2vec. This paper proposes a sentence-representation model inspired by quantum entanglement that uses the tensor product to entangle both pairs of consecutive notional words and words linked by dependencies. Inspired by quantum entanglement coefficients, we construct two different entanglement coefficients to weight the different semantic contributions of words with different relations. Finally, the proposed model is applied to the SICK_train dataset to verify its performance. The experimental results show that the proposed methods achieve excellent results.
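To make the core idea concrete, the following is a minimal sketch (not the paper's actual method) of how a tensor product can combine two word2vec vectors into a weighted pair representation. The coefficient definition, function names, and toy vectors are all assumptions for illustration; the abstract does not specify the exact form of the two entanglement coefficients.

```python
import numpy as np

def entangle(u: np.ndarray, v: np.ndarray, weight: float) -> np.ndarray:
    """Weighted tensor (outer) product of two word embeddings,
    producing a matrix-valued representation of the word pair."""
    return weight * np.outer(u, v)

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy 4-dimensional stand-ins for word2vec vectors of two consecutive
# notional words (real embeddings would come from a trained word2vec model).
rng = np.random.default_rng(0)
w1, w2 = rng.normal(size=4), rng.normal(size=4)

# Hypothetical "entanglement" coefficient based on cosine similarity;
# the paper defines two different coefficients, not given in the abstract.
coeff = abs(cosine(w1, w2))

pair_repr = entangle(w1, w2, coeff)   # 4x4 matrix for the word pair
sentence_feat = pair_repr.flatten()   # flatten into a sentence-level feature
print(sentence_feat.shape)            # (16,)
```

In this sketch the flattened pair matrices could be aggregated over all consecutive and dependency-linked word pairs to form a sentence representation for similarity scoring; how the paper actually aggregates them is not stated in the abstract.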