Distributed Training for Multilingual Combined Tokenizer using Deep Learning Model and Simple Communication Protocol

Christian Nathaniel Purwanto, Ary Hermawan, Joan Santoso, Gunawan
{"title":"Distributed Training for Multilingual Combined Tokenizer using Deep Learning Model and Simple Communication Protocol","authors":"Christian Nathaniel Purwanto, Ary Hermawan, Joan Santoso, Gunawan","doi":"10.1109/ICORIS.2019.8874898","DOIUrl":null,"url":null,"abstract":"In the big data era, text processing tends to be harder as the data increase. There is also the growth of deep learning model for solving natural language processing tasks without a need for hand-crafted rules. In this research, we provide two big solutions in the area of text preprocessing and distributed training for any neural-based model. We try to solve the most common text preprocessing which are word and sentence tokenization. Our proposed combined tokenizer is compared by using a single language model and multilanguage model. We also provide a simple communication using MQTT protocol to help the training distribution.","PeriodicalId":118443,"journal":{"name":"2019 1st International Conference on Cybernetics and Intelligent System (ICORIS)","volume":"73 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 1st International Conference on Cybernetics and Intelligent System (ICORIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICORIS.2019.8874898","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

In the big data era, text processing becomes harder as the volume of data grows. At the same time, deep learning models are increasingly used to solve natural language processing tasks without hand-crafted rules. In this research, we provide two solutions: one for text preprocessing and one for distributed training of any neural-based model. We address the most common text preprocessing tasks, namely word and sentence tokenization. Our proposed combined tokenizer is evaluated with both a single-language model and a multilingual model. We also provide a simple communication scheme using the MQTT protocol to support distributed training.
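The abstract does not spell out how the MQTT-based communication is organized, so the following is only a minimal sketch of how workers might exchange weight updates through an MQTT broker. The paho-mqtt client (1.x API), the topic names, the broker address, and the pickle-serialized NumPy payloads are all assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch of MQTT-based update exchange for distributed training.
# Assumptions (not from the paper): paho-mqtt 1.x client API, a broker on
# localhost:1883, pickle-serialized NumPy weight arrays, and these topic names.
import pickle

import numpy as np
import paho.mqtt.client as mqtt

BROKER_HOST = "localhost"           # assumed broker address
UPDATE_TOPIC = "training/updates"   # workers publish their local updates here
MODEL_TOPIC = "training/model"      # aggregator broadcasts averaged weights here


def run_worker(weights: np.ndarray) -> None:
    """Publish a locally computed weight update to the broker."""
    client = mqtt.Client()
    client.connect(BROKER_HOST, 1883)
    client.publish(UPDATE_TOPIC, pickle.dumps(weights), qos=1)
    client.disconnect()


def run_aggregator(num_workers: int) -> None:
    """Collect one update per worker, average them, and broadcast the result."""
    updates = []

    def on_message(client, userdata, msg):
        updates.append(pickle.loads(msg.payload))
        if len(updates) == num_workers:
            averaged = np.mean(updates, axis=0)
            client.publish(MODEL_TOPIC, pickle.dumps(averaged), qos=1)
            client.disconnect()  # ends loop_forever() once all updates arrive

    client = mqtt.Client()
    client.on_message = on_message
    client.connect(BROKER_HOST, 1883)
    client.subscribe(UPDATE_TOPIC, qos=1)
    client.loop_forever()
```

In this sketch the broker decouples the workers from the aggregator: each worker only needs the broker address and a topic name, which matches the paper's emphasis on keeping the training-distribution communication simple.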