{"title":"基于量子纠缠的潜在语义关联挖掘","authors":"Zan Li, Yuexian Hou, Tingsan Pan, Tian Tian, Yingjie Gao","doi":"10.1145/3507548.3507598","DOIUrl":null,"url":null,"abstract":"Text representation learning is the cornerstone of solving downstream problems in Natural Language Processing (NLP). However, mining the potential explanatory factors or semantic associations behind data, rather than simply representing the superficial co-occurrence of words, remains a non-trivial challenge. To this end, we seek inspiration from the Quantum Entanglement (QE) which can effectively provide a complete description for the nature of realities and a globally-determined intrinsic correlation of considered objects, thus proposing a novel representation learning hypothesis called the Latent Semantic Correlation (LSC), namely the implicit internal coherence between the semantic space and its corresponding category space. To construct a multi-granularity representation from sememes to words, phrases, sentences, and higher-level LSC, we implement a QE-inspired Network (QEN) under the constraints of quantum formalism and propose the Local Semantic Measurement (LSM) and Extraction (LSE) for effectively capturing probability distribution information from the entangled state of a bipartite quantum system, which has a clear geometrical motivation but also supports a well-founded probabilistic interpretation. Experimental results conducted on several benchmarking classification tasks prove the validity of the LSC hypothesis and the superiority of QEN.","PeriodicalId":414908,"journal":{"name":"Proceedings of the 2021 5th International Conference on Computer Science and Artificial Intelligence","volume":"15 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Mining Latent Semantic Correlation inspired by Quantum Entanglement\",\"authors\":\"Zan Li, Yuexian Hou, Tingsan Pan, Tian Tian, Yingjie Gao\",\"doi\":\"10.1145/3507548.3507598\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Text representation learning is the cornerstone of solving downstream problems in Natural Language Processing (NLP). However, mining the potential explanatory factors or semantic associations behind data, rather than simply representing the superficial co-occurrence of words, remains a non-trivial challenge. To this end, we seek inspiration from the Quantum Entanglement (QE) which can effectively provide a complete description for the nature of realities and a globally-determined intrinsic correlation of considered objects, thus proposing a novel representation learning hypothesis called the Latent Semantic Correlation (LSC), namely the implicit internal coherence between the semantic space and its corresponding category space. To construct a multi-granularity representation from sememes to words, phrases, sentences, and higher-level LSC, we implement a QE-inspired Network (QEN) under the constraints of quantum formalism and propose the Local Semantic Measurement (LSM) and Extraction (LSE) for effectively capturing probability distribution information from the entangled state of a bipartite quantum system, which has a clear geometrical motivation but also supports a well-founded probabilistic interpretation. 
Experimental results conducted on several benchmarking classification tasks prove the validity of the LSC hypothesis and the superiority of QEN.\",\"PeriodicalId\":414908,\"journal\":{\"name\":\"Proceedings of the 2021 5th International Conference on Computer Science and Artificial Intelligence\",\"volume\":\"15 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-12-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2021 5th International Conference on Computer Science and Artificial Intelligence\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3507548.3507598\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2021 5th International Conference on Computer Science and Artificial Intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3507548.3507598","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Text representation learning is the cornerstone of solving downstream problems in Natural Language Processing (NLP). However, mining the latent explanatory factors or semantic associations behind data, rather than merely representing the superficial co-occurrence of words, remains a non-trivial challenge. To this end, we draw inspiration from Quantum Entanglement (QE), which can provide a complete description of the nature of a composite system and a globally determined intrinsic correlation among the objects under consideration, and propose a novel representation learning hypothesis called Latent Semantic Correlation (LSC): the implicit internal coherence between a semantic space and its corresponding category space. To construct a multi-granularity representation from sememes to words, phrases, sentences, and higher-level LSC, we implement a QE-inspired Network (QEN) under the constraints of the quantum formalism and propose Local Semantic Measurement (LSM) and Local Semantic Extraction (LSE) to effectively capture probability distribution information from the entangled state of a bipartite quantum system; this approach has a clear geometric motivation and also admits a well-founded probabilistic interpretation. Experimental results on several benchmark classification tasks confirm the validity of the LSC hypothesis and the superiority of QEN.
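
The abstract does not spell out the concrete form of LSM/LSE, but the quantum formalism it invokes is standard: measuring one subsystem of a bipartite entangled state yields a probability distribution given by the Born rule, computable from the reduced density matrix obtained by tracing out the other subsystem. The sketch below is a generic illustration of that formalism under assumed inputs (the particular state, basis, and variable names are illustrative), not the paper's actual implementation.

```python
import numpy as np

# A bipartite pure state |psi> on H_A (dim 2) x H_B (dim 2).
# An assumed, non-maximally entangled example: 0.8|00> + 0.6|11>.
# In the paper's setting, A and B would correspond to the semantic
# space and the category space, respectively.
psi = np.array([0.8, 0.0, 0.0, 0.6], dtype=complex)
psi /= np.linalg.norm(psi)

# Full density matrix rho = |psi><psi|, reshaped so the A and B
# indices are explicit: rho[a, b, a', b'].
rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)

# Reduced density matrix of subsystem A: partial trace over B.
rho_A = np.einsum('abcb->ac', rho)

# Born rule: the probability of outcome |a> when measuring A in the
# computational basis is <a|rho_A|a>, i.e. the diagonal of rho_A.
p_A = np.real(np.diag(rho_A))
print("P(A outcomes):", p_A)          # -> [0.64, 0.36]

# Joint distribution over simultaneous measurement outcomes of A and B,
# again via the Born rule on the full state.
p_joint = np.abs(psi.reshape(2, 2)) ** 2
print("P(A,B outcomes):\n", p_joint)  # mass only on (0,0) and (1,1): correlated
```

The correlated joint distribution (probability mass only on matching outcomes) is what makes the entangled state a natural metaphor for the LSC hypothesis; how QEN learns such a state and how LSM/LSE read distributional information out of it are detailed in the paper itself rather than the abstract.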