Kriti Nemkul, S. Shakya
2021 International Conference on Computing, Communication, and Intelligent Systems (ICCCIS), 2021-02-19
DOI: 10.1109/ICCCIS51004.2021.9397185
English to Nepali Sentence Translation Using Recurrent Neural Network with Attention
Machine Translation is an automated system that takes text in a source language as input, applies computation to it, and produces equivalent text in a target language without human involvement. This research work focuses on developing models for English-to-Nepali sentence translation using the Gated Recurrent Unit (GRU) and Long Short-Term Memory (LSTM) with attention. The Bilingual Evaluation Understudy (BLEU) score is calculated to evaluate the effectiveness of each model. Different parameters have been used to test the models: 2 and 4 neural-network layers, and 128, 256, and 512 hidden units. A GRU encoder and decoder with attention, using 2 layers and 512 hidden units, performs best at translating English sentences into Nepali, achieving the highest BLEU score of 12.3.
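The BLEU metric mentioned above compares candidate n-grams against reference n-grams and applies a brevity penalty. The paper does not specify its BLEU implementation, so the following is a minimal stdlib-only sketch of sentence-level BLEU with uniform weights, written to illustrate the metric rather than reproduce the authors' exact evaluation (the smoothing floor of 1e-9 is an assumption to avoid log(0)):

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU: geometric mean of clipped n-gram
    precisions (n = 1..max_n) times a brevity penalty."""
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # clipped overlap: each candidate n-gram counts at most
        # as often as it appears in the reference
        overlap = sum((cand_counts & ref_counts).values())
        total = max(sum(cand_counts.values()), 1)
        # floor at a tiny value so a zero precision does not
        # zero out the whole score (simple smoothing, an assumption)
        precisions.append(max(overlap, 1e-9) / total)
    # brevity penalty discourages overly short candidates
    if len(candidate) > len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

reference = "the cat sat on the mat".split()
print(bleu(reference, reference))   # identical sentences score 1.0
```

A reported score such as 12.3 corresponds to this quantity multiplied by 100, the usual convention in machine-translation papers.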