Junqing Fan, Xiaorong Tian, Chengyao Lv, Simin Zhang, Yuewei Wang, Junfeng Zhang
{"title":"基于MFMMR-BertSum的提取社交媒体文本摘要","authors":"Junqing Fan , Xiaorong Tian , Chengyao Lv , Simin Zhang , Yuewei Wang , Junfeng Zhang","doi":"10.1016/j.array.2023.100322","DOIUrl":null,"url":null,"abstract":"<div><p>The advancement of computer technology has led to an overwhelming amount of textual information, hindering the efficiency of knowledge intake. To address this issue, various text summarization techniques have been developed, including statistics, graph sorting, machine learning, and deep learning. However, the rich semantic features of text often interfere with the abstract effects and lack effective processing of redundant information. In this paper, we propose the Multi-Features Maximal Marginal Relevance BERT (MFMMR-BertSum) model for Extractive Summarization, which utilizes the pre-trained model BERT to tackle the text summarization task. The model incorporates a classification layer for extractive summarization. Additionally, the Maximal Marginal Relevance (MMR) component is utilized to remove information redundancy and optimize the summary results. The proposed method outperforms other sentence-level extractive summarization baseline methods on the CNN/DailyMail dataset, thus verifying its effectiveness.</p></div>","PeriodicalId":8417,"journal":{"name":"Array","volume":null,"pages":null},"PeriodicalIF":2.3000,"publicationDate":"2023-10-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Extractive social media text summarization based on MFMMR-BertSum\",\"authors\":\"Junqing Fan , Xiaorong Tian , Chengyao Lv , Simin Zhang , Yuewei Wang , Junfeng Zhang\",\"doi\":\"10.1016/j.array.2023.100322\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>The advancement of computer technology has led to an overwhelming amount of textual information, hindering the efficiency of knowledge intake. To address this issue, various text summarization techniques have been developed, including statistics, graph sorting, machine learning, and deep learning. However, the rich semantic features of text often interfere with the abstract effects and lack effective processing of redundant information. In this paper, we propose the Multi-Features Maximal Marginal Relevance BERT (MFMMR-BertSum) model for Extractive Summarization, which utilizes the pre-trained model BERT to tackle the text summarization task. The model incorporates a classification layer for extractive summarization. Additionally, the Maximal Marginal Relevance (MMR) component is utilized to remove information redundancy and optimize the summary results. 
The proposed method outperforms other sentence-level extractive summarization baseline methods on the CNN/DailyMail dataset, thus verifying its effectiveness.</p></div>\",\"PeriodicalId\":8417,\"journal\":{\"name\":\"Array\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":2.3000,\"publicationDate\":\"2023-10-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Array\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2590005623000474\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, THEORY & METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Array","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2590005623000474","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Extractive social media text summarization based on MFMMR-BertSum
The advancement of computer technology has led to an overwhelming amount of textual information, hindering the efficiency of knowledge intake. To address this issue, various text summarization techniques have been developed, based on statistics, graph-based ranking, machine learning, and deep learning. However, the rich semantic features of text often interfere with summarization quality, and existing methods lack effective handling of redundant information. In this paper, we propose the Multi-Features Maximal Marginal Relevance BERT (MFMMR-BertSum) model for extractive summarization, which uses the pre-trained BERT model to tackle the text summarization task. The model adds a classification layer on top of BERT for extractive summarization. Additionally, a Maximal Marginal Relevance (MMR) component is used to remove information redundancy and refine the summary. The proposed method outperforms other sentence-level extractive summarization baselines on the CNN/DailyMail dataset, verifying its effectiveness.
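For illustration, the sketch below shows one common way to combine sentence-level relevance scores with MMR-based redundancy removal, in the spirit of the pipeline the abstract describes. It is a minimal sketch, not the paper's implementation: the function names, the λ trade-off value, the use of cosine similarity, and the toy inputs are assumptions; in MFMMR-BertSum the relevance scores and sentence representations would come from the BERT encoder and its classification layer.

```python
# Minimal MMR sentence-selection sketch (assumed details, not the paper's code).
# scores[i] stands in for a BERT classifier's relevance score for sentence i;
# embeddings[i] stands in for that sentence's encoder vector.
import numpy as np

def cosine(a, b):
    # Cosine similarity with a small epsilon to avoid division by zero.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def mmr_select(scores, embeddings, k=3, lam=0.7):
    """Pick k sentence indices balancing relevance and redundancy.

    lam = 1.0 selects purely by relevance; lam = 0.0 purely by diversity.
    """
    selected = []
    candidates = list(range(len(scores)))
    while candidates and len(selected) < k:
        def mmr_value(i):
            # Penalize similarity to sentences already chosen for the summary.
            redundancy = max(
                (cosine(embeddings[i], embeddings[j]) for j in selected),
                default=0.0,
            )
            return lam * scores[i] - (1 - lam) * redundancy
        best = max(candidates, key=mmr_value)
        selected.append(best)
        candidates.remove(best)
    return sorted(selected)  # keep original sentence order in the summary

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    emb = rng.normal(size=(6, 8))                      # toy embeddings for 6 sentences
    rel = np.array([0.9, 0.85, 0.2, 0.7, 0.1, 0.6])    # toy relevance scores
    print(mmr_select(rel, emb, k=3))
```

The design choice MMR captures is the trade-off in the selection criterion itself: each step picks the sentence that is most relevant while being least similar to what is already in the summary, which is how redundant sentences get filtered out of the extractive result.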