Title: Abstractive method-based Text Summarization using Bidirectional Long Short-Term Memory and Pointer Generator Mode
Authors: Saroj Anand Tripathy, S. Ashok
DOI: 10.22201/icat.24486736e.2023.21.1.1446
Journal: Journal of Applied Research and Technology (JCR Q3, Engineering)
Published: 2023-02-27 (Journal Article)
Citations: 1
Abstract
With the rise of the Internet, we have a vast amount of information at our disposal, and we are swamped by many sources: news, social media, office emails, to name a few. This paper addresses the problem of reading through such extensive information by summarizing it with an abstractive text summarizer built on deep learning models, namely bidirectional Long Short-Term Memory (LSTM) networks and the Pointer-Generator model. The LSTM model (a modification of the Recurrent Neural Network) is trained and tested on the Amazon Fine Food Reviews dataset using a decoder with Bahdanau attention and ConceptNet Numberbatch embeddings, which are similar to, and in some respects better than, GloVe. The Pointer-Generator model is trained and tested on the CNN/Daily Mail dataset and uses both decoder and attention inputs. Because of two major problems with the LSTM model, namely the network's inability to copy factual details and its tendency to repeat words, the second method, the Pointer-Generator model, is used. This paper analyzes both models to provide a better understanding of how they work and to enable the creation of a strong text summarizer. The main purpose is to provide reliable summaries of datasets or uploaded files, depending on the user's choice. Unnecessary sentences are rejected in order to retain the most important ones.
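The abstract mentions a decoder with Bahdanau (additive) attention but gives no implementation details. The following is a minimal NumPy sketch of one additive-attention step, under the standard formulation score_i = vᵀ tanh(W_h h_i + W_s s); all dimensions, parameter names, and values are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

T, enc_dim, dec_dim, attn_dim = 5, 8, 8, 6   # toy sizes (illustrative)

# Encoder hidden states h_1..h_T and the current decoder state s_t
H = rng.normal(size=(T, enc_dim))
s = rng.normal(size=(dec_dim,))

# Learned parameters of the additive score: score_i = v^T tanh(W_h h_i + W_s s)
W_h = rng.normal(size=(attn_dim, enc_dim))
W_s = rng.normal(size=(attn_dim, dec_dim))
v = rng.normal(size=(attn_dim,))

scores = np.tanh(H @ W_h.T + s @ W_s.T) @ v   # (T,) unnormalized scores
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()                          # softmax: attention weights sum to 1
context = alpha @ H                           # context vector: weighted sum of encoder states

print(context.shape)
```

At each decoding step the context vector is concatenated with the decoder state to predict the next word, so the summary can attend to different source positions over time.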
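The Pointer-Generator model addresses the copying problem by mixing the generator's vocabulary distribution with the attention distribution over source tokens: P(w) = p_gen · P_vocab(w) + (1 − p_gen) · Σ_{i: w_i = w} α_i. The paper does not give code, so this is a toy NumPy sketch of that final mixture; the sizes, token ids, and the fixed p_gen stand in for quantities a trained model would compute.

```python
import numpy as np

rng = np.random.default_rng(1)

vocab_size, T = 10, 4                       # toy vocabulary and source length
src_ids = np.array([2, 7, 7, 3])            # source token ids (repeats are allowed)

p_vocab = rng.dirichlet(np.ones(vocab_size))  # generator's distribution over the vocabulary
alpha = rng.dirichlet(np.ones(T))             # attention weights over source positions
p_gen = 0.6                                   # generation probability; learned sigmoid in practice

# P(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum of attention on source copies of w
p_final = p_gen * p_vocab
np.add.at(p_final, src_ids, (1 - p_gen) * alpha)  # scatter-add handles repeated ids correctly
```

Because the copy term routes probability mass directly to source tokens, the model can reproduce names and rare facts that a pure generator would garble, which is exactly the LSTM weakness the abstract cites.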
About the journal:
The Journal of Applied Research and Technology (JART) is a bimonthly open access journal that publishes papers on innovative applications, development of new technologies and efficient solutions in engineering, computing and scientific research. JART publishes manuscripts describing original research, with significant results based on experimental, theoretical and numerical work.
The journal does not charge for submission, processing, publication of manuscripts or for color reproduction of photographs.
JART classifies research into the following main fields:
-Material Science:
Biomaterials, carbon, ceramics, composite, metals, polymers, thin films, functional materials and semiconductors.
-Computer Science:
Computer graphics and visualization, programming, human-computer interaction, neural networks, image processing and software engineering.
-Industrial Engineering:
Operations research, systems engineering, management science, complex systems and cybernetics applications, and information technologies.
-Electronic Engineering:
Solid-state physics, radio engineering, telecommunications, control systems, signal processing, power electronics, electronic devices and circuits and automation.
-Instrumentation Engineering and Science:
Measurement devices (pressure, temperature, flow, voltage, frequency etc.), precision engineering, medical devices, instrumentation for education (devices and software), sensor technology, mechatronics and robotics.