Text Summarization using Transformer Model

Jaishree Ranganathan, Gloria Abuka
DOI: 10.1109/SNAMS58071.2022.10062698
Published in: 2022 Ninth International Conference on Social Networks Analysis, Management and Security (SNAMS)
Publication date: 2022-11-29
Citations: 2

Abstract

The increased availability of online feedback and review tools, and the enormous amount of information on these platforms, have made text summarization a vital research area in natural language processing. Instead of potential consumers reading through thousands of reviews to find the information they need, summarization lets them see a concise form of a large set of reviews with the relevant information. News and scientific articles have commonly been used in text summarization models. This study proposes a text summarization method based on the Text-to-Text Transfer Transformer (T5) model. We use the University of California, Irvine (UCI) drug reviews dataset. We manually created human summaries of the ten most useful reviews per drug for 500 different drugs from the dataset. We fine-tune the T5 model to perform abstractive text summarization. The model's effectiveness was evaluated using ROUGE metrics, and our model achieved average ROUGE-1, ROUGE-2, and ROUGE-L scores of 45.62, 25.58, and 36.53, respectively. We also fine-tuned this model on a standard dataset (the BBC News dataset) previously used for text summarization and obtained average ROUGE-1, ROUGE-2, and ROUGE-L scores of 69.05, 59.70, and 52.97, respectively.
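The ROUGE scores reported above measure n-gram and longest-common-subsequence overlap between a generated summary and a human reference. As a rough illustration of what those numbers mean, the sketch below implements ROUGE-N and ROUGE-L F1 in pure Python; the paper's results were presumably computed with a standard ROUGE package, so treat this as illustrative, not as the authors' evaluation code.

```python
from collections import Counter

def _ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n(candidate: str, reference: str, n: int = 1) -> float:
    """ROUGE-N F1: n-gram overlap between candidate and reference summaries."""
    cand = _ngrams(candidate.lower().split(), n)
    ref = _ngrams(reference.lower().split(), n)
    overlap = sum((cand & ref).values())  # clipped n-gram match count
    if overlap == 0 or not cand or not ref:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

def rouge_l(candidate: str, reference: str) -> float:
    """ROUGE-L F1, based on the longest common subsequence of tokens."""
    a, b = candidate.lower().split(), reference.lower().split()
    # Dynamic-programming table for LCS length.
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            dp[i + 1][j + 1] = dp[i][j] + 1 if x == y else max(dp[i][j + 1], dp[i + 1][j])
    lcs = dp[-1][-1]
    if lcs == 0:
        return 0.0
    precision, recall = lcs / len(a), lcs / len(b)
    return 2 * precision * recall / (precision + recall)
```

For example, `rouge_n("the cat sat on the mat", "the cat is on the mat", 1)` is 5/6, since five of the six candidate unigrams appear in the reference; a paper-style score of 45.62 corresponds to an F1 of 0.4562 averaged over the test summaries.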