Text Summarization Clustered Transformer (TSCT)

R. D. Ahmed, M. Abdulhak, Omar Hesham ELNabrawy
2022 International Conference on Frontiers of Artificial Intelligence and Machine Learning (FAIML)
DOI: 10.1109/FAIML57028.2022.00041 (https://doi.org/10.1109/FAIML57028.2022.00041)
Published: 2022-06-01
Citations: 0

Abstract

Natural language processing has recently gained considerable attention due to the rapid worldwide growth of online and offline data. Extractive text summarization selects sentences from a corpus using salient related information to produce a concise summary. However, most existing approaches to sentence feature engineering do not exploit contextual information or the relations among sentences. We present clustered Transformer models to mitigate this issue, namely text summarization using clustered Transformer models. Our framework operates on contextual representations to capture diverse linguistic context information, and it also uses surface features to improve the modeling of word- and sentence-level elements. A hierarchical attention mechanism captures contextual relations at both the word and sentence levels using the Transformer model. Finally, we add clustering after the Transformer model to group the most similar sentences, improving the attention quality of the resulting extractive summary.
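The clustering stage described in the abstract can be sketched as follows: once a transformer encoder has produced one embedding per sentence, the embeddings are clustered and the sentence nearest each cluster centroid is emitted as the extractive summary. This is a minimal illustration only: the paper's actual transformer encoder is mocked here with bag-of-words vectors, and the k-means routine and cluster count are assumptions, not the authors' implementation.

```python
import numpy as np


def embed(sentences):
    """Toy stand-in for transformer sentence embeddings (bag of words)."""
    vocab = sorted({w for s in sentences for w in s.lower().split()})
    index = {w: i for i, w in enumerate(vocab)}
    vecs = np.zeros((len(sentences), len(vocab)))
    for row, s in enumerate(sentences):
        for w in s.lower().split():
            vecs[row, index[w]] += 1.0
    return vecs


def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: assign points to nearest centroid, recompute means."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels


def extractive_summary(sentences, k=2):
    """Pick, for each cluster, the member sentence closest to the centroid."""
    X = embed(sentences)
    centroids, labels = kmeans(X, k)
    picks = []
    for j in range(k):
        members = np.flatnonzero(labels == j)
        if members.size == 0:  # a cluster may end up empty; skip it
            continue
        member_dists = np.linalg.norm(X[members] - centroids[j], axis=1)
        picks.append(members[member_dists.argmin()])
    # Preserve the original sentence order in the summary.
    return [sentences[i] for i in sorted(set(picks))]
```

The design choice mirrors the abstract: clustering after the encoder groups semantically similar sentences, so each selected sentence represents one group rather than repeating near-duplicates.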