Electrical Equipment Knowledge Graph Embedding Using Language Model with Self-learned Prompts

Hong Yang, Xiaokai Meng, Hua Yu, Yang Bai, Yu Han, Yongxin Liu

International Journal of High Speed Electronics and Systems, published 2024-08-09. DOI: 10.1142/s0129156424400500

Abstract: In recent years, knowledge graphs have had a significant impact across diverse domains. Notably, the power knowledge graph has garnered considerable attention as a high-performance database. However, its reasoning capabilities remain largely untapped, offering an enticing avenue for exploration. One of the main obstacles is the sparsity of power grid datasets, especially electrical equipment knowledge graphs: because high-risk records are scarce, these graphs contain a large number of long-tail entities and long-tail relations. To address this challenge, we introduce a novel text-based model called GELMSP (Graph Embedding using Language Model with Self-learned Prompts). We employ a bi-encoder structure along with a contrastive learning strategy to expedite training. Additionally, our approach incorporates a mechanism that generates prompts for specific situations without requiring any additional information, which we refer to as self-learning. This harnesses the power of pre-trained language models to comprehend the semantic nuances within the entities and relationships of the knowledge graph, enabling our model to handle sparse datasets effectively and to build a comprehensive understanding of interconnectedness within the graph. Finally, we demonstrate the efficacy of our model through extensive experiments and comparisons against baseline methods, confirming its potential to advance the state of the art in electrical equipment defect diagnosis.
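The paper itself does not include code; the following is a minimal sketch of the bi-encoder, in-batch contrastive setup the abstract describes. A simple hashing bag-of-words encoder stands in for the pre-trained language model, the prompt string is a placeholder for the self-learned prompts, and the example triples are hypothetical illustrations, not data from the paper.

```python
import numpy as np

DIM = 64  # toy embedding dimension

def encode(text: str) -> np.ndarray:
    """Stand-in text encoder: hashes tokens into a normalized bag-of-words
    vector. In GELMSP this role is played by a pre-trained language model."""
    vec = np.zeros(DIM)
    for tok in text.lower().split():
        vec[hash(tok) % DIM] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def score_matrix(queries, candidates) -> np.ndarray:
    """Bi-encoder scoring: cosine similarity between every
    (prompt + head + relation) query and every candidate tail."""
    Q = np.stack([encode(q) for q in queries])    # (B, DIM)
    C = np.stack([encode(c) for c in candidates]) # (B, DIM)
    return Q @ C.T                                # (B, B)

def info_nce_loss(scores: np.ndarray, temperature: float = 0.1) -> float:
    """In-batch contrastive (InfoNCE-style) loss: each query's true tail is
    the diagonal entry; the other tails in the batch act as negatives."""
    logits = scores / temperature
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    return float(-np.log(np.diag(probs)).mean())

# Hypothetical triples from an electrical-equipment defect graph.
prompt = "[defect context]"  # placeholder for a self-learned prompt string
queries = [f"{prompt} transformer has_defect",
           f"{prompt} circuit breaker has_defect"]
tails = ["oil leakage", "contact overheating"]

scores = score_matrix(queries, tails)  # (2, 2) similarity matrix
loss = info_nce_loss(scores)           # scalar training signal
```

Because both sides of the triple are encoded independently, candidate-tail embeddings can be precomputed once and reused across queries, which is the usual efficiency argument for a bi-encoder over a cross-encoder on sparse, long-tail data.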
Journal introduction:
Launched in 1990, the International Journal of High Speed Electronics and Systems (IJHSES) has served graduate students and readers in R&D, managerial, and marketing positions by providing state-of-the-art data and the latest research trends. Its main charter is to promote engineering education by advancing interdisciplinary science between electronics and systems and by exploring high-speed technology in photonics and electronics. IJHSES, a quarterly journal, continues to feature broad coverage of topics relating to high-speed or high-performance devices, circuits, and systems.