Comparing Machine Translation Accuracy of Attention Models
Dat Pham Tuan, Duy Pham Ngoc
2020 7th NAFOSTED Conference on Information and Computer Science (NICS), published 2020-11-26
DOI: 10.1109/NICS51282.2020.9335916
Abstract
Machine translation models built on a plain encoder-decoder architecture do not achieve accuracy as high as expected. One reason for this shortfall is the absence of an attention mechanism during the training phase. Attention-based models overcome the drawbacks of earlier architectures and obtain noteworthy improvements in accuracy. In this paper, we experiment with three attention models and evaluate their BLEU scores on small data sets. The Bahdanau model achieves high accuracy, the Transformer model obtains good accuracy, while the Luong model reaches only acceptable accuracy.
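As background for the comparison above, the scaled dot-product attention at the core of the Transformer model can be sketched as follows. This is a minimal NumPy illustration of the standard formula softmax(QKᵀ/√d_k)V, not the authors' implementation; all variable names here are illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard Transformer attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_queries, n_keys) similarities
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights                   # weighted values, attention weights

# Toy example: 2 decoder queries attending over 3 encoder states of dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)                       # (2, 4)
print(np.allclose(w.sum(axis=-1), 1))  # True: each query's weights sum to 1
```

The Bahdanau and Luong models evaluated in the paper differ mainly in how the scores are computed (an additive feed-forward score versus multiplicative variants), but the softmax-weighted sum over encoder states is the same pattern.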