Inflectional Review of Deep Learning on Natural Language Processing

SK Ahammad Fahad, Abdulsamad E. Yahya
{"title":"Inflectional Review of Deep Learning on Natural Language Processing","authors":"SK Ahammad Fahad, Abdulsamad E. Yahya","doi":"10.1109/ICSCEE.2018.8538416","DOIUrl":null,"url":null,"abstract":"In the age of knowledge, Natural Language Processing (NLP) express its demand by a huge range of utilization. Previously NLP was dealing with statically data. Contemporary time NLP is doing considerably with the corpus, lexicon database, pattern reorganization. Considering Deep Learning (DL) method recognize artificial Neural Network (NN) to nonlinear process, NLP tools become increasingly accurate and efficient that begin a debacle. Multi-Layer Neural Network obtaining the importance of the NLP for its capability including standard speed and resolute output. Hierarchical designs of data operate recurring processing layers to learn and with this arrangement of DL methods manage several practices. In this paper, this resumed striving to reach a review of the tools and the necessary methodology to present a clear understanding of the association of NLP and DL for truly understand in the training. Efficiency and execution both are improved in NLP by Part of speech tagging (POST), Morphological Analysis, Named Entity Recognition (NER), Semantic Role Labeling (SRL), Syntactic Parsing, and Coreference resolution. Artificial Neural Networks (ANN), Time Delay Neural Networks (TDNN), Recurrent Neural Network (RNN), Convolution Neural Networks (CNN), and Long-Short-Term-Memory (LSTM) dealings among Dense Vector (DV), Windows Approach (WA), and Multitask learning (MTL) as a characteristic of Deep Learning. After statically methods, when DL communicate the influence of NLP, the individual form of the NLP process and DL rule collaboration was started a fundamental connection.","PeriodicalId":265737,"journal":{"name":"2018 International Conference on Smart Computing and Electronic Enterprise (ICSCEE)","volume":"31 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"24","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 International Conference on Smart Computing and Electronic Enterprise (ICSCEE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICSCEE.2018.8538416","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 24

Abstract

In the age of knowledge, Natural Language Processing (NLP) is in demand across a huge range of applications. NLP previously dealt with static data; today it works extensively with corpora, lexicon databases, and pattern recognition. Because Deep Learning (DL) methods use artificial Neural Networks (NN) to model nonlinear processes, NLP tools have become increasingly accurate and efficient, which set off an upheaval in the field. Multi-layer neural networks have gained importance in NLP for their capabilities, including consistent speed and reliable output. Hierarchical representations of data are learned through repeated processing layers, and with this arrangement DL methods handle several NLP practices. This paper strives to review the tools and the necessary methodology in order to present a clear understanding of the association between NLP and DL, so that the training involved can be truly understood. Both efficiency and execution in NLP are improved by Part-of-Speech Tagging (POST), Morphological Analysis, Named Entity Recognition (NER), Semantic Role Labeling (SRL), Syntactic Parsing, and Coreference Resolution. Artificial Neural Networks (ANN), Time-Delay Neural Networks (TDNN), Recurrent Neural Networks (RNN), Convolutional Neural Networks (CNN), and Long Short-Term Memory (LSTM) networks operate with Dense Vectors (DV), the Window Approach (WA), and Multitask Learning (MTL) as characteristics of Deep Learning. After statistical methods, once DL brought its influence to NLP, the individual forms of the NLP process and the collaboration of DL rules established a fundamental connection.
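The abstract's pairing of dense word vectors with the window approach can be grounded with a short sketch. The following is a minimal, hypothetical PyTorch example of a window-approach tagger over learned embeddings, in the spirit of the ANN/DV/WA setup the paper surveys; it is not the implementation described in the paper, and the vocabulary size, tag set, window size, and hidden width are illustrative assumptions.

```python
# Minimal window-approach POS tagger with dense word vectors (PyTorch).
# Illustrative sketch only; all sizes below are assumed, not taken from the paper.
import torch
import torch.nn as nn

class WindowTagger(nn.Module):
    def __init__(self, vocab_size, num_tags, embed_dim=50, window=2, hidden_dim=100):
        super().__init__()
        self.window = window
        # Dense Vector (DV): each word id maps to a learned embedding.
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Window Approach (WA): a word's tag is predicted from the concatenated
        # embeddings of the word and its neighbours inside a fixed window.
        self.hidden = nn.Linear((2 * window + 1) * embed_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_tags)

    def forward(self, word_ids):
        # word_ids: (seq_len,) tensor of word indices for one sentence.
        padded = nn.functional.pad(word_ids, (self.window, self.window))  # pad with index 0
        windows = padded.unfold(0, 2 * self.window + 1, 1)   # (seq_len, 2*window+1)
        vectors = self.embed(windows).flatten(1)              # (seq_len, (2w+1)*embed_dim)
        return self.out(torch.tanh(self.hidden(vectors)))     # tag scores per position

# Toy usage: 6-word vocabulary (index 0 reserved for padding), 3 hypothetical tags.
model = WindowTagger(vocab_size=6, num_tags=3)
sentence = torch.tensor([1, 4, 2, 5, 3])           # word indices for one sentence
scores = model(sentence)                           # (5, 3) unnormalised tag scores
loss = nn.functional.cross_entropy(scores, torch.tensor([0, 1, 2, 1, 0]))
loss.backward()                                    # gradients reach embeddings and weights
```

The same embedding table could be shared by several output heads (e.g. POST and NER), which is one simple reading of the Multitask Learning (MTL) setting mentioned in the abstract.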