TK-BERT: Effective Model of Language Representation using Topic-based Knowledge Graphs

Chanwook Min, Jinhyun Ahn, Taewhi Lee, Dong-Hyuk Im

2023 17th International Conference on Ubiquitous Information Management and Communication (IMCOM), January 3, 2023. DOI: 10.1109/IMCOM56909.2023.10035573
Recently, the K-BERT model was proposed to inject knowledge into language representations for specialized fields. K-BERT uses a knowledge graph to perform transfer learning on a pre-trained BERT model. However, when drawing on the knowledge graph of a given field, K-BERT injects any matching knowledge found in the graph, whether or not it is relevant to the topic of the input data, and this irrelevant knowledge can confuse training. To solve this problem, this study proposes a topic-based knowledge graph BERT (TK-BERT) model built on topic modeling. TK-BERT partitions the knowledge graph by topic using a topic model of the graph, infers the topic of each input sentence, and injects only the knowledge relevant to that topic. TK-BERT therefore avoids injecting unnecessary knowledge from the graph. Moreover, the proposed TK-BERT model outperforms the K-BERT model.
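The abstract sketches a three-step pipeline: partition the knowledge graph by topic, infer a topic for the input sentence, and inject only triples belonging to that topic. Below is a minimal sketch of that filtering step, assuming an off-the-shelf LDA model (gensim) stands in for the paper's topic model; the toy triples, tokenizer, and topic count are hypothetical illustrations, not the authors' actual configuration.

```python
# Hypothetical sketch of topic-based knowledge filtering in the spirit of
# TK-BERT. Assumption: LDA (gensim) is used as the topic model; the paper
# does not specify this exact setup.
from gensim.corpora import Dictionary
from gensim.models import LdaModel

# Toy knowledge-graph triples (head, relation, tail) -- illustrative only.
triples = [
    ("aspirin", "treats", "headache"),
    ("ibuprofen", "is_a", "anti-inflammatory drug"),
    ("python", "is_a", "programming language"),
    ("bert", "used_for", "language representation"),
]

def tokenize(text):
    return text.lower().replace("_", " ").split()

# Treat each triple as a tiny document so the topic model can cluster it.
docs = [tokenize(" ".join(t)) for t in triples]
dictionary = Dictionary(docs)
corpus = [dictionary.doc2bow(d) for d in docs]

# Partition the knowledge graph by topic with a small LDA model.
lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=2,
               random_state=0, passes=50)

def dominant_topic(bow):
    # Most probable topic for a bag-of-words document.
    return max(lda.get_document_topics(bow, minimum_probability=0.0),
               key=lambda p: p[1])[0]

triple_topics = [dominant_topic(bow) for bow in corpus]

def relevant_triples(sentence):
    # Infer the sentence's topic, then keep only triples from that topic,
    # so off-topic knowledge is never injected into the input.
    topic = dominant_topic(dictionary.doc2bow(tokenize(sentence)))
    return [t for t, tt in zip(triples, triple_topics) if tt == topic]

print(relevant_triples("which drug treats a headache"))
```

In this sketch, a sentence about drugs would retrieve only the medical triples, while the programming and NLP triples are filtered out, mirroring the abstract's claim that TK-BERT avoids injecting knowledge unrelated to the input's topic.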