Data Representation for Deep Learning-Based Arabic Text Summarization Performance Using Python Results

Mohamed Yassin Abdelwahab Yassin, Yazeed Al Moaiad
DOI: 10.15379/ijmst.v11i1.3646 (https://doi.org/10.15379/ijmst.v11i1.3646)
Journal: International Journal of Membrane Science and Technology
Published: 2024-04-13 (Journal Article)

Abstract

A sequence-to-sequence model serves as the foundation for the proposed abstractive Arabic text summarization system. Our goal is to build a sequence-to-sequence model using multiple deep artificial neural networks and to determine which performs best. The encoder and decoder have been developed using several layers of recurrent neural networks, gated recurrent units, recursive neural networks, convolutional neural networks, long short-term memory, and bidirectional long short-term memory. In this study we re-implement the fundamental summarization model, which uses the sequence-to-sequence framework. We built these models with the Keras library in a Google Colab Jupyter notebook. The results further demonstrate that one of the key techniques behind breakthrough performance with deep neural networks is the use of Gensim word embeddings, rather than other text representations, in abstractive summarization models, together with FastText, a library for efficient learning of word representations and sentence classification.
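The encoder-decoder architecture described above can be sketched in Keras as follows. This is a minimal illustration rather than the authors' exact model: the vocabulary size, embedding dimension, and latent dimension are hypothetical placeholders, and the LSTM layers could equally be swapped for the GRU or bidirectional variants the study compares.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical hyperparameters for illustration only.
vocab_size = 5000   # size of the Arabic token vocabulary
embed_dim = 128     # word-embedding dimension
latent_dim = 256    # LSTM hidden-state size

# Encoder: embeds the source article and keeps the final LSTM states.
encoder_inputs = keras.Input(shape=(None,), name="article_tokens")
x = layers.Embedding(vocab_size, embed_dim, mask_zero=True)(encoder_inputs)
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(x)

# Decoder: generates the summary conditioned on the encoder's final states.
decoder_inputs = keras.Input(shape=(None,), name="summary_tokens")
y = layers.Embedding(vocab_size, embed_dim, mask_zero=True)(decoder_inputs)
y = layers.LSTM(latent_dim, return_sequences=True)(y, initial_state=[state_h, state_c])
outputs = layers.Dense(vocab_size, activation="softmax")(y)

model = keras.Model([encoder_inputs, decoder_inputs], outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

Replacing `layers.LSTM` with `layers.GRU`, or wrapping it in `layers.Bidirectional`, yields the other encoder-decoder variants the study benchmarks against one another.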