{"title":"SemanticGraph2Vec: Semantic graph embedding for text representation","authors":"Wael Etaiwi, Arafat Awajan","doi":"10.1016/j.array.2023.100276","DOIUrl":null,"url":null,"abstract":"<div><p>Graph embedding is an important representational technique that aims to maintain the structure of a graph while learning low-dimensional representations of its vertices. Semantic relationships between vertices contain essential information regarding the meaning of the represented graph. However, most graph embedding methods do not consider the semantic relationships during the learning process. In this paper, we propose a novel semantic graph embedding approach, called SemanticGraph2Vec. SemanticGraph2Vec learns mappings of vertices into low-dimensional feature spaces that consider the most important semantic relationships between graph vertices. The proposed approach extends and enhances prior work based on a set of random walks of graph vertices by using semantic walks instead of random walks which provides more useful embeddings for text graphs. A set of experiments are conducted to evaluate the performance of SemanticGraph2Vec. SemanticGraph2Vec is employed on a part-of-speech tagging task. Experimental results demonstrate that SemanticGraph2Vec outperforms two state-of-the-art baselines methods in terms of precision and F1 score.</p></div>","PeriodicalId":8417,"journal":{"name":"Array","volume":null,"pages":null},"PeriodicalIF":2.3000,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Array","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2590005623000012","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Citations: 4
Abstract
Graph embedding is an important representational technique that aims to maintain the structure of a graph while learning low-dimensional representations of its vertices. Semantic relationships between vertices carry essential information about the meaning of the represented graph, yet most graph embedding methods do not consider these relationships during the learning process. In this paper, we propose a novel semantic graph embedding approach called SemanticGraph2Vec. SemanticGraph2Vec learns mappings of vertices into low-dimensional feature spaces that take into account the most important semantic relationships between graph vertices. The proposed approach extends and enhances prior work based on random walks over graph vertices by using semantic walks instead of random walks, which yields more useful embeddings for text graphs. A set of experiments is conducted to evaluate the performance of SemanticGraph2Vec on a part-of-speech tagging task. Experimental results demonstrate that SemanticGraph2Vec outperforms two state-of-the-art baseline methods in terms of precision and F1 score.
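The paper itself is not reproduced here, so the following is only a minimal sketch of the general idea the abstract describes: generating walks over a text graph that are biased by semantic edge types (rather than uniformly random) and feeding them to a skip-gram learner, as in DeepWalk/node2vec-style methods. The toy graph, the `RELATION_WEIGHT` table, the `semantic_walk` helper, and the use of networkx and gensim's Word2Vec are illustrative assumptions, not the authors' exact procedure.

```python
# Sketch: walk-based graph embedding where the next vertex is chosen with
# probability proportional to a weight attached to the connecting semantic
# relation, instead of uniformly at random.
# Assumptions (not from the paper): edge types, their weights, walk settings,
# and the use of gensim's skip-gram Word2Vec as the embedding learner.
import random

import networkx as nx
from gensim.models import Word2Vec

# Toy text graph: vertices are words, edges carry a semantic relation label.
G = nx.Graph()
G.add_edge("dog", "animal", relation="hypernym")
G.add_edge("dog", "barks", relation="subject-verb")
G.add_edge("cat", "animal", relation="hypernym")
G.add_edge("cat", "meows", relation="subject-verb")

# Hypothetical preference weights: higher weight = the walk favors this relation.
RELATION_WEIGHT = {"hypernym": 2.0, "subject-verb": 1.0, "synonym": 3.0}


def semantic_walk(graph, start, length):
    """Generate one walk, picking each next vertex with probability
    proportional to the weight of the connecting semantic relation."""
    walk = [start]
    current = start
    for _ in range(length - 1):
        neighbors = list(graph.neighbors(current))
        if not neighbors:
            break
        weights = [
            RELATION_WEIGHT.get(graph[current][n]["relation"], 1.0)
            for n in neighbors
        ]
        current = random.choices(neighbors, weights=weights, k=1)[0]
        walk.append(current)
    return walk


# Build a corpus of semantically biased walks and learn skip-gram embeddings
# over it, treating each walk as a "sentence".
walks = [semantic_walk(G, node, length=5) for node in G.nodes() for _ in range(10)]
model = Word2Vec(sentences=walks, vector_size=32, window=3, min_count=0, sg=1)

print(model.wv["dog"][:5])  # low-dimensional embedding of the vertex "dog"
```

In this sketch the only difference from a plain random-walk embedder is the weighted choice of the next vertex; everything downstream (the skip-gram objective over walk windows) is unchanged, which is how the abstract frames SemanticGraph2Vec relative to prior random-walk-based work.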