Metarelation2vec: A Metapath-Free Scalable Representation Learning Model for Heterogeneous Networks

IF 5.2 · CAS Region 1 (Computer Science) · JCR Q1, COMPUTER SCIENCE, INFORMATION SYSTEMS
Lei Chen;Yuan Li;Yong Lei;Xingye Deng
{"title":"Metarelation2vec:一种适用于异构网络的无元路径可伸缩表示学习模型","authors":"Lei Chen;Yuan Li;Yong Lei;Xingye Deng","doi":"10.26599/TST.2023.9010044","DOIUrl":null,"url":null,"abstract":"Metapaths with specific complex semantics are critical to learning diverse semantic and structural information of heterogeneous networks (HNs) for most of the existing representation learning models. However, any metapaths consisting of multiple, simple metarelations must be driven by domain experts. These sensitive, expensive, and limited metapaths severely reduce the flexibility and scalability of the existing models. A metapath-free, scalable representation learning model, called Metarelation2vec, is proposed for HNs with biased joint learning of all metarelations in a bid to address this problem. Specifically, a metarelation-aware, biased walk strategy is first designed to obtain better training samples by using autogenerating cooperation probabilities for all metarelations rather than using expert-given metapaths. Thereafter, grouped nodes by the type, a common and shallow skip-gram model is used to separately learn structural proximity for each node type. Next, grouped links by the type, a novel and shallow model is used to separately learn the semantic proximity for each link type. Finally, supervised by the cooperation probabilities of all meta-words, the biased training samples are thrown into the shallow models to jointly learn the structural and semantic information in the HNs, ensuring the accuracy and scalability of the models. 
Extensive experimental results on three tasks and four open datasets demonstrate the advantages of our proposed model.","PeriodicalId":60306,"journal":{"name":"Tsinghua Science and Technology","volume":"29 2","pages":"553-575"},"PeriodicalIF":5.2000,"publicationDate":"2023-09-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/iel7/5971803/10258149/10258166.pdf","citationCount":"0","resultStr":"{\"title\":\"Metarelation2vec: A Metapath-Free Scalable Representation Learning Model for Heterogeneous Networks\",\"authors\":\"Lei Chen;Yuan Li;Yong Lei;Xingye Deng\",\"doi\":\"10.26599/TST.2023.9010044\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Metapaths with specific complex semantics are critical to learning diverse semantic and structural information of heterogeneous networks (HNs) for most of the existing representation learning models. However, any metapaths consisting of multiple, simple metarelations must be driven by domain experts. These sensitive, expensive, and limited metapaths severely reduce the flexibility and scalability of the existing models. A metapath-free, scalable representation learning model, called Metarelation2vec, is proposed for HNs with biased joint learning of all metarelations in a bid to address this problem. Specifically, a metarelation-aware, biased walk strategy is first designed to obtain better training samples by using autogenerating cooperation probabilities for all metarelations rather than using expert-given metapaths. Thereafter, grouped nodes by the type, a common and shallow skip-gram model is used to separately learn structural proximity for each node type. Next, grouped links by the type, a novel and shallow model is used to separately learn the semantic proximity for each link type. 
Finally, supervised by the cooperation probabilities of all meta-words, the biased training samples are thrown into the shallow models to jointly learn the structural and semantic information in the HNs, ensuring the accuracy and scalability of the models. Extensive experimental results on three tasks and four open datasets demonstrate the advantages of our proposed model.\",\"PeriodicalId\":60306,\"journal\":{\"name\":\"Tsinghua Science and Technology\",\"volume\":\"29 2\",\"pages\":\"553-575\"},\"PeriodicalIF\":5.2000,\"publicationDate\":\"2023-09-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/iel7/5971803/10258149/10258166.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Tsinghua Science and Technology\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10258166/\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Tsinghua Science and Technology","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10258166/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Metapaths with specific, complex semantics are critical for most existing representation learning models to capture the diverse semantic and structural information of heterogeneous networks (HNs). However, metapaths composed of multiple simple metarelations must be designed by domain experts. These sensitive, expensive, and limited metapaths severely reduce the flexibility and scalability of existing models. To address this problem, a metapath-free, scalable representation learning model called Metarelation2vec is proposed for HNs, with biased joint learning over all metarelations. Specifically, a metarelation-aware biased walk strategy is first designed to obtain better training samples by automatically generating cooperation probabilities for all metarelations, rather than relying on expert-given metapaths. Then, with nodes grouped by type, a common shallow skip-gram model is used to learn structural proximity separately for each node type. Next, with links grouped by type, a novel shallow model is used to learn semantic proximity separately for each link type. Finally, supervised by the cooperation probabilities of all meta-words, the biased training samples are fed into the shallow models to jointly learn the structural and semantic information of the HNs, ensuring both the accuracy and the scalability of the model. Extensive experimental results on three tasks and four open datasets demonstrate the advantages of the proposed model.
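The metarelation-aware biased walk described above can be illustrated with a minimal sketch. This is not the paper's exact method: it assumes, for illustration, that a "cooperation probability" for each metarelation (a source-type/target-type pair) can be approximated by that metarelation's relative frequency in the edge list, and it then biases neighbor sampling by those probabilities instead of restricting the walk to an expert-given metapath. All function names are hypothetical.

```python
import random
from collections import defaultdict

def metarelation_probs(edges, node_type):
    """Approximate a cooperation probability for each metarelation
    (source type, target type) by its relative frequency among all
    edges -- a stand-in for the paper's auto-generated probabilities."""
    counts = defaultdict(int)
    for u, v in edges:
        counts[(node_type[u], node_type[v])] += 1
    total = sum(counts.values())
    return {mr: c / total for mr, c in counts.items()}

def biased_walk(adj, node_type, probs, start, length, rng=random.Random(0)):
    """Metarelation-aware biased walk: at each hop, neighbors are
    sampled in proportion to the cooperation probability of the
    metarelation that the hop would traverse; if no neighbor has a
    known metarelation, fall back to a uniform choice."""
    walk = [start]
    for _ in range(length - 1):
        u = walk[-1]
        nbrs = adj[u]
        if not nbrs:
            break
        weights = [probs.get((node_type[u], node_type[v]), 0.0) for v in nbrs]
        if sum(weights) == 0:
            nxt = rng.choice(nbrs)
        else:
            nxt = rng.choices(nbrs, weights=weights, k=1)[0]
        walk.append(nxt)
    return walk
```

Walks generated this way can then be grouped by node type and fed to per-type shallow models, in the spirit of the grouped skip-gram training the abstract describes.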
Source journal: Tsinghua Science and Technology
CiteScore: 12.10 · Self-citation rate: 0.00% · Annual article count: 2340