Automatic Summarization of Scientific Documents Using Transformer Architectures: A Review
Ralivat Haruna, A. Obiniyi, Muhammed Abdulkarim, A.A. Afolorunsho
2022 5th Information Technology for Education and Development (ITED), November 2022
DOI: 10.1109/ITED56637.2022.10051602
Citations: 0
Abstract
As technology advances, the volume of textual material produced on the web has risen steadily, and extracting useful information from it can take considerable time and effort. Automatic text summarization aims to create concise summaries that retain the most important parts of a source document. Transformer-based architectures have performed exceptionally well in Natural Language Processing (NLP), particularly for summarizing textual content. This paper presents a thorough analysis of recent advances in transformer architectures for automatic text summarization, with a focus on the Bidirectional and Auto-Regressive Transformer (BART). It also highlights future directions for research on transformer-based models for automatic text summarization, such as BART and BERT.
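For context on what abstractive models such as BART improve upon: the classical alternative is extractive summarization, which selects the highest-scoring source sentences verbatim rather than generating new text. A minimal word-frequency sketch of that baseline (an illustration only, not code from the paper; the function name and scoring scheme are assumptions):

```python
import re
from collections import Counter

def extractive_summary(text, num_sentences=2):
    """Classical extractive baseline: score each sentence by the corpus
    frequency of its words and return the top sentences in original order.
    Abstractive transformer models like BART instead generate new sentences."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'\w+', text.lower()))
    # (score, original position, sentence) for each sentence
    scored = [(sum(freq[w] for w in re.findall(r'\w+', s.lower())), i, s)
              for i, s in enumerate(sentences)]
    top = sorted(scored, reverse=True)[:num_sentences]
    # restore original sentence order for readability
    return ' '.join(s for _, _, s in sorted(top, key=lambda t: t[1]))
```

A BART-style model, by contrast, encodes the full document bidirectionally and decodes a summary autoregressively, so its output need not reuse any source sentence verbatim.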