Machine Comprehension Comparison using Latest Advancements in Deep Learning

Aryan Kumar Singh, Kapil Gyanchandani, Pramod Kumar Singh, J. Prakash
{"title":"Machine Comprehension Comparison using Latest Advancements in Deep Learning","authors":"Aryan Kumar Singh, Kapil Gyanchandani, Pramod Kumar Singh, J. Prakash","doi":"10.1109/INDICON52576.2021.9691737","DOIUrl":null,"url":null,"abstract":"Machine Comprehension or Question Answering (QA) is one of the most challenging natural language processing tasks due to the language’s dynamic nature and understanding the context of the question. In this paper, we propose a similarity attention layer with an aim to reduce human labor by automating tedious QA tasks using the attention mechanism in deep learning model; it uses attention scores and obtains good results even without pre-training. The QA using attention has immense scope in search engine optimization, page ranking, and chatbots. The traditional rule-based models and statistical methods underperform due to variations in the language. This dynamic nature of the language is well captured by the nonlinear learning of the neural networks. The conventional encoder-decoder architecture of neural networks for QA works well in the case of short sentences. However, the performance comes down for paragraphs and very long sentences as it is difficult for the network to memorize the super-long sentences. In contrast, the attention model helps the network focus on smaller attention areas in the complex input paragraph, part by part, until the entire text is processed. The results are very promising; our (single) model outperforms the existing ensemble method too.","PeriodicalId":106004,"journal":{"name":"2021 IEEE 18th India Council International Conference (INDICON)","volume":"102 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE 18th India Council International Conference (INDICON)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/INDICON52576.2021.9691737","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Machine Comprehension, or Question Answering (QA), is one of the most challenging natural language processing tasks due to the dynamic nature of language and the need to understand the context of a question. In this paper, we propose a similarity attention layer that aims to reduce human labor by automating tedious QA tasks with the attention mechanism in a deep learning model; it uses attention scores and obtains good results even without pre-training. QA using attention has immense scope in search engine optimization, page ranking, and chatbots. Traditional rule-based models and statistical methods underperform because of variation in language, whereas the nonlinear learning of neural networks captures this dynamic nature well. The conventional encoder-decoder architecture for QA works well on short sentences, but its performance degrades on paragraphs and very long sentences because the network struggles to memorize such long inputs. In contrast, the attention model helps the network focus on smaller attention areas in a complex input paragraph, part by part, until the entire text is processed. The results are promising: our single model even outperforms an existing ensemble method.
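The record includes no code, but the core idea the abstract describes, scoring each context token against the question and reweighting the context by those similarity scores, lends itself to a short sketch. The PyTorch snippet below is a minimal illustration of a similarity attention layer of this kind; the class name, the learned projection, and the tensor shapes are assumptions made for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimilarityAttention(nn.Module):
    """Hypothetical sketch of a similarity attention layer for QA:
    each context token is scored against every question token, and the
    scores are used to build a question-aware summary of the context."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Trainable projection used when computing similarity scores.
        self.proj = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, context: torch.Tensor, question: torch.Tensor) -> torch.Tensor:
        # context:  (batch, ctx_len, hidden_dim) -- encoded paragraph
        # question: (batch, q_len, hidden_dim)   -- encoded question
        # Similarity matrix over all (context, question) token pairs:
        # shape (batch, ctx_len, q_len).
        scores = torch.matmul(self.proj(context), question.transpose(1, 2))
        # Normalize over question tokens to obtain attention weights.
        weights = F.softmax(scores, dim=-1)
        # Each context position attends to the question:
        # shape (batch, ctx_len, hidden_dim).
        attended = torch.matmul(weights, question)
        # Fuse the original context with its question-aware view.
        return torch.cat([context, attended], dim=-1)

# Toy usage with random embeddings standing in for encoder outputs.
layer = SimilarityAttention(hidden_dim=64)
ctx = torch.randn(2, 100, 64)   # a "paragraph" of 100 tokens
q = torch.randn(2, 12, 64)      # a 12-token question
out = layer(ctx, q)
print(out.shape)                # torch.Size([2, 100, 128])
```

Concatenating the context with its question-aware view is one common design choice; the attended representation could equally be fed to a span-prediction head that marks answer start and end positions in the paragraph.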