{"title":"利用图注意网络将AMR集成到神经机器翻译中","authors":"Long H. B. Nguyen, Viet H. Pham, D. Dinh","doi":"10.1109/NICS51282.2020.9335896","DOIUrl":null,"url":null,"abstract":"Semantic representation is potentially useful to enforce meaning preservation and improve generalization performance of machine translation methods. In this paper, we incorporate semantic information from Abstract Meaning Representation (AMR) semantic graphs into neural machine translation. First, we use Graph Attention Networks (GATs) to encode the AMR graphs into a vector space. Then, we propose an effective way to integrate the semantic representation to the attention-encoder-decoder translation model. The experimental results show the improvements in BLEU scores over the baseline method on the English-Vietnamese language pair.","PeriodicalId":308944,"journal":{"name":"2020 7th NAFOSTED Conference on Information and Computer Science (NICS)","volume":"59 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-11-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Integrating AMR to Neural Machine Translation using Graph Attention Networks\",\"authors\":\"Long H. B. Nguyen, Viet H. Pham, D. Dinh\",\"doi\":\"10.1109/NICS51282.2020.9335896\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Semantic representation is potentially useful to enforce meaning preservation and improve generalization performance of machine translation methods. In this paper, we incorporate semantic information from Abstract Meaning Representation (AMR) semantic graphs into neural machine translation. First, we use Graph Attention Networks (GATs) to encode the AMR graphs into a vector space. Then, we propose an effective way to integrate the semantic representation to the attention-encoder-decoder translation model. The experimental results show the improvements in BLEU scores over the baseline method on the English-Vietnamese language pair.\",\"PeriodicalId\":308944,\"journal\":{\"name\":\"2020 7th NAFOSTED Conference on Information and Computer Science (NICS)\",\"volume\":\"59 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-11-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 7th NAFOSTED Conference on Information and Computer Science (NICS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/NICS51282.2020.9335896\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 7th NAFOSTED Conference on Information and Computer Science (NICS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/NICS51282.2020.9335896","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Integrating AMR to Neural Machine Translation using Graph Attention Networks
Semantic representation is potentially useful for enforcing meaning preservation and improving the generalization performance of machine translation methods. In this paper, we incorporate semantic information from Abstract Meaning Representation (AMR) graphs into neural machine translation. First, we use Graph Attention Networks (GATs) to encode the AMR graphs into a vector space. Then, we propose an effective way to integrate the resulting semantic representation into the attention-based encoder-decoder translation model. Experimental results show improvements in BLEU score over the baseline method on the English-Vietnamese language pair.
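The abstract states that AMR graphs are encoded with a GAT before being fed to the translation model, but gives no implementation details. The sketch below illustrates a single GAT layer over AMR concept nodes following the standard GAT formulation; the dimensions, class name, and toy adjacency matrix are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch: one Graph Attention Network (GAT) layer encoding AMR concept
# nodes into vector representations. Assumed details (dims, toy graph) are not
# from the paper; this follows the generic GAT formulation only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # shared node projection
        self.a = nn.Linear(2 * out_dim, 1, bias=False)    # attention scoring vector

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (num_nodes, in_dim)   AMR concept embeddings
        # adj: (num_nodes, num_nodes) adjacency mask of the AMR graph (1 = edge)
        z = self.W(h)                                      # (N, out_dim)
        n = z.size(0)
        # Pairwise concatenation [z_i || z_j] for all node pairs
        pairs = torch.cat(
            [z.unsqueeze(1).expand(n, n, -1), z.unsqueeze(0).expand(n, n, -1)],
            dim=-1,
        )
        e = F.leaky_relu(self.a(pairs).squeeze(-1))        # raw attention scores
        e = e.masked_fill(adj == 0, float("-inf"))         # attend only along AMR edges
        alpha = torch.softmax(e, dim=-1)                   # normalized attention weights
        return F.elu(alpha @ z)                            # updated node representations

# Toy usage: 3 AMR concept nodes with self-loops and two relation edges.
if __name__ == "__main__":
    h = torch.randn(3, 64)                                 # concept embeddings
    adj = torch.tensor([[1, 1, 0],
                        [1, 1, 1],
                        [0, 1, 1]], dtype=torch.float)
    gat = GATLayer(64, 64)
    semantic_repr = gat(h, adj)                            # would be passed to the NMT encoder-decoder
    print(semantic_repr.shape)                             # torch.Size([3, 64])
```

In this reading, the GAT output provides per-node semantic vectors that an attention-based encoder-decoder can attend to alongside the source-token representations; how the paper fuses the two is not specified in the abstract.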