{"title":"引导变压器生成用于AMR解析的图结构","authors":"Runliang Niu, Qi Wang","doi":"10.1117/12.2639102","DOIUrl":null,"url":null,"abstract":"Abstract Meaning Representation (AMR) is a kind of semantic representation of natural language, which aims to represent the semantics of a sentence by a rooted, directed, and acyclic graph (DAG). Most existing AMR parsing works are designed under specific dictionary. However, these works make the content length of each node limited, and they mainly need to go through a very complicated post-processing process. In this paper, we propose a novel encoder-decoder framework for AMR parsing to address these issues, which generates a graph structure and predicts node relationships simultaneously. Specifically, we represent each node as a five-tuple form, containing token sequence of variable length and the connection relationship with other nodes. BERT model is employed as the encoder module. Our decoder module first generates a linearization representation of the graph structure, then predicts multiple elements of each node by four different attention based classifiers. We also found an effective way to improve the generalization performance of Transformer model for graph generation. By assigning different index number to nodes in each training step and remove positional encoding used in most generative models, the model can learn the relationship between nodes better. Experiments against two AMR datasets demonstrate the competitive performance of our proposed method compared with baseline methods.","PeriodicalId":336892,"journal":{"name":"Neural Networks, Information and Communication Engineering","volume":"9 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Guiding transformer to generate graph structure for AMR parsing\",\"authors\":\"Runliang Niu, Qi Wang\",\"doi\":\"10.1117/12.2639102\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract Meaning Representation (AMR) is a kind of semantic representation of natural language, which aims to represent the semantics of a sentence by a rooted, directed, and acyclic graph (DAG). Most existing AMR parsing works are designed under specific dictionary. However, these works make the content length of each node limited, and they mainly need to go through a very complicated post-processing process. In this paper, we propose a novel encoder-decoder framework for AMR parsing to address these issues, which generates a graph structure and predicts node relationships simultaneously. Specifically, we represent each node as a five-tuple form, containing token sequence of variable length and the connection relationship with other nodes. BERT model is employed as the encoder module. Our decoder module first generates a linearization representation of the graph structure, then predicts multiple elements of each node by four different attention based classifiers. We also found an effective way to improve the generalization performance of Transformer model for graph generation. By assigning different index number to nodes in each training step and remove positional encoding used in most generative models, the model can learn the relationship between nodes better. 
Experiments against two AMR datasets demonstrate the competitive performance of our proposed method compared with baseline methods.\",\"PeriodicalId\":336892,\"journal\":{\"name\":\"Neural Networks, Information and Communication Engineering\",\"volume\":\"9 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-06-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Networks, Information and Communication Engineering\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1117/12.2639102\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks, Information and Communication Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.2639102","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Guiding transformer to generate graph structure for AMR parsing
Abstract Meaning Representation (AMR) is a semantic representation of natural language that aims to capture the meaning of a sentence as a rooted, directed, acyclic graph (DAG). Most existing AMR parsing approaches are built around a specific dictionary; as a result, the content length of each node is limited, and the parsers typically require a complicated post-processing stage. In this paper, we propose a novel encoder-decoder framework for AMR parsing that addresses these issues by generating the graph structure and predicting node relationships simultaneously. Specifically, we represent each node as a five-tuple containing a variable-length token sequence and its connection relationships with other nodes. A BERT model is employed as the encoder. Our decoder first generates a linearized representation of the graph structure and then predicts the elements of each node with four attention-based classifiers. We also found an effective way to improve the generalization of the Transformer model for graph generation: by assigning different index numbers to the nodes in each training step and removing the positional encoding used in most generative models, the model learns the relationships between nodes better. Experiments on two AMR datasets demonstrate the competitive performance of our proposed method compared with baseline methods.
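Of the five-tuple node representation, the abstract explicitly names only the variable-length token sequence and the connections to other nodes; the remaining fields are not spelled out. The Python sketch below is therefore a minimal illustration, under those assumptions, of how a node might be stored and how different index numbers could be assigned to nodes at each training step in place of a positional encoding. The names `AMRNode` and `reindex_nodes`, and the example relation label, are hypothetical and introduced here only for illustration; they are not taken from the paper.

```python
import random
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Hypothetical node structure: only the variable-length token sequence and the
# outgoing edges are described in the abstract; any further five-tuple fields
# are omitted here.
@dataclass
class AMRNode:
    index: int                    # index number assigned for the current training step
    tokens: List[str]             # variable-length token sequence (node content)
    edges: List[Tuple[int, str]]  # (target node index, relation label)

def reindex_nodes(nodes: List[AMRNode], seed: Optional[int] = None) -> List[AMRNode]:
    """Assign fresh random index numbers to all nodes for one training step,
    remapping edge targets accordingly, so the model cannot rely on fixed
    absolute positions (the role normally played by positional encoding)."""
    rng = random.Random(seed)
    new_ids = list(range(len(nodes)))
    rng.shuffle(new_ids)
    old_to_new = {n.index: new_ids[i] for i, n in enumerate(nodes)}
    return [
        AMRNode(
            index=old_to_new[n.index],
            tokens=n.tokens,
            edges=[(old_to_new[target], rel) for target, rel in n.edges],
        )
        for n in nodes
    ]

# Example: a tiny two-node graph re-indexed for one training step.
graph = [
    AMRNode(0, ["want", "-01"], [(1, ":ARG0")]),
    AMRNode(1, ["boy"], []),
]
print(reindex_nodes(graph, seed=42))
```

Calling `reindex_nodes` once per training step yields a different node numbering each time, which is one plausible reading of the abstract's statement that varying node indices, together with dropping positional encoding, helps the model learn inter-node relationships rather than absolute positions.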