{"title":"A Unigram Orientation Model for Statistical Machine Translation","authors":"C. Tillmann","doi":"10.3115/1613984.1614010","DOIUrl":null,"url":null,"abstract":"In this paper, we present a unigram segmentation model for statistical machine translation where the segmentation units are blocks: pairs of phrases without internal structure. The segmentation model uses a novel orientation component to handle swapping of neighbor blocks. During training, we collect block unigram counts with orientation: we count how often a block occurs to the left or to the right of some predecessor block. The orientation model is shown to improve translation performance over two models: 1) no block re-ordering is used, and 2) the block swapping is controlled only by a language model. We show experimental results on a standard Arabic-English translation task.","PeriodicalId":252736,"journal":{"name":"Proceedings of HLT-NAACL 2004: Short Papers on XX - HLT-NAACL '04","volume":"93 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2004-05-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"344","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of HLT-NAACL 2004: Short Papers on XX - HLT-NAACL '04","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3115/1613984.1614010","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 344
Abstract
In this paper, we present a unigram segmentation model for statistical machine translation where the segmentation units are blocks: pairs of phrases without internal structure. The segmentation model uses a novel orientation component to handle swapping of neighbor blocks. During training, we collect block unigram counts with orientation: we count how often a block occurs to the left or to the right of some predecessor block. The orientation model is shown to improve translation performance over two baseline models: 1) one that uses no block re-ordering, and 2) one in which block swapping is controlled only by a language model. We show experimental results on a standard Arabic-English translation task.
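To make the counting step concrete, the sketch below shows one way the block unigram counts with orientation could be collected. This is not the paper's code; it is a minimal illustration assuming each training sentence pair has already been segmented into blocks listed in target order, with each block carrying the source-side start position of its phrase, so the orientation (left, right, or neutral for the first block) relative to the predecessor block can be read off from the source positions.

```python
from collections import defaultdict

def collect_orientation_counts(block_sequences):
    """Collect block unigram counts with orientation.

    block_sequences: a list of sentence-level block lists (target order).
    Each block is a hypothetical tuple (source_phrase, target_phrase,
    source_start), where source_start is the source position of the
    block's first word.
    """
    # For each block, count left (L), right (R), and neutral (N) occurrences.
    counts = defaultdict(lambda: {"L": 0, "R": 0, "N": 0})
    for blocks in block_sequences:
        prev_start = None
        for source_phrase, target_phrase, source_start in blocks:
            block_id = (source_phrase, target_phrase)
            if prev_start is None:
                counts[block_id]["N"] += 1   # first block: no predecessor
            elif source_start < prev_start:
                counts[block_id]["L"] += 1   # swapped to the left of predecessor
            else:
                counts[block_id]["R"] += 1   # monotone: to the right of predecessor
            prev_start = source_start
    return counts

# Toy, invented example: one sentence segmented into three blocks,
# the second of which is swapped relative to its predecessor.
example = [[("kitab", "book", 2), ("al", "the", 0), ("jadid", "new", 4)]]
print(collect_orientation_counts(example))
```

The relative-frequency estimates of these left/right counts would then serve as the unigram orientation probabilities used to score block swaps during decoding.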