{"title":"文本摘要集群转换器(TSCT)","authors":"R. D. Ahmed, M. Abdulhak, Omar Hesham ELNabrawy","doi":"10.1109/FAIML57028.2022.00041","DOIUrl":null,"url":null,"abstract":"Natural language processing has recently had a considerable reputation due to the quick increase in online and offline data worldwide. The Extractive text summarization grabs the sentence from the corpus using salient related information to produce a concise summary. However, most existing approach to extracting sentence feature engineering has not utilized related contextual information and relation among the sentence. We present clustered Transformer models to mitigate this issue, namely Text summarization using clustered Transformer models. Our proposal has the highest benefit. The utility of our frame-work working on contextual representation is to grab various linguistic context information. We also use surface features to improve our understanding of word and sentence elements. Another utility is that the hierarchical attention mechanism can capture the contextual relation from the word and sentence levels using the transform model. Also, we added clustering after the transformer model to capture the most similar sentence to improve the attentive quality for producing the extractive text summarization.","PeriodicalId":307172,"journal":{"name":"2022 International Conference on Frontiers of Artificial Intelligence and Machine Learning (FAIML)","volume":"58-60 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Text Summarization Clustered Transformer (TSCT)\",\"authors\":\"R. D. Ahmed, M. Abdulhak, Omar Hesham ELNabrawy\",\"doi\":\"10.1109/FAIML57028.2022.00041\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Natural language processing has recently had a considerable reputation due to the quick increase in online and offline data worldwide. 
The Extractive text summarization grabs the sentence from the corpus using salient related information to produce a concise summary. However, most existing approach to extracting sentence feature engineering has not utilized related contextual information and relation among the sentence. We present clustered Transformer models to mitigate this issue, namely Text summarization using clustered Transformer models. Our proposal has the highest benefit. The utility of our frame-work working on contextual representation is to grab various linguistic context information. We also use surface features to improve our understanding of word and sentence elements. Another utility is that the hierarchical attention mechanism can capture the contextual relation from the word and sentence levels using the transform model. Also, we added clustering after the transformer model to capture the most similar sentence to improve the attentive quality for producing the extractive text summarization.\",\"PeriodicalId\":307172,\"journal\":{\"name\":\"2022 International Conference on Frontiers of Artificial Intelligence and Machine Learning (FAIML)\",\"volume\":\"58-60 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 International Conference on Frontiers of Artificial Intelligence and Machine Learning (FAIML)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/FAIML57028.2022.00041\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Conference on Frontiers of Artificial 
Intelligence and Machine Learning (FAIML)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/FAIML57028.2022.00041","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Natural language processing has recently gained considerable attention due to the rapid growth of online and offline data worldwide. Extractive text summarization selects sentences from a corpus based on salient related information to produce a concise summary. However, most existing approaches to sentence feature engineering do not exploit contextual information or the relations among sentences. To mitigate this issue, we present a clustered Transformer model, namely the Text Summarization Clustered Transformer (TSCT). Our framework offers several benefits. Its contextual representations capture varied linguistic context information, and we additionally use surface features to improve the modeling of word- and sentence-level elements. A hierarchical attention mechanism captures contextual relations at both the word and sentence levels through the transformer model. Finally, we add a clustering stage after the transformer to group the most similar sentences, improving attention quality when producing the extractive summary.
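The pipeline the abstract describes — encode sentences, cluster the embeddings, then pick a representative sentence per cluster — can be sketched as follows. This is a minimal illustration, not the paper's implementation: a toy bag-of-words vectorizer stands in for the transformer encoder so the sketch stays self-contained, and the `embed`, `kmeans`, and `summarize` names are hypothetical.

```python
# Sketch of extractive summarization by clustering sentence embeddings.
# In the real system, embeddings would come from a transformer with
# hierarchical attention; here a normalized bag-of-words vector is a
# stand-in so this runs without external dependencies.
import math
from collections import Counter


def embed(sentence, vocab):
    """Toy stand-in for a transformer sentence encoder:
    an L2-normalized bag-of-words vector over a fixed vocabulary."""
    counts = Counter(sentence.lower().split())
    vec = [counts[w] for w in vocab]
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]


def kmeans(points, k, iters=20):
    """Plain k-means with deterministic first-k initialization;
    returns (assignment per point, centroids)."""
    centroids = [list(points[i]) for i in range(k)]
    assign = [0] * len(points)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        for i, p in enumerate(points):
            assign[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
        # Recompute each centroid as the mean of its members.
        for c in range(k):
            members = [points[i] for i in range(len(points)) if assign[i] == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign, centroids


def summarize(sentences, k=2):
    """Select, from each cluster, the sentence closest to the centroid."""
    vocab = sorted({w for s in sentences for w in s.lower().split()})
    points = [embed(s, vocab) for s in sentences]
    assign, centroids = kmeans(points, k)
    picked = []
    for c in range(k):
        idxs = [i for i in range(len(sentences)) if assign[i] == c]
        if not idxs:
            continue  # a cluster may end up empty
        best = min(
            idxs,
            key=lambda i: sum((a - b) ** 2 for a, b in zip(points[i], centroids[c])),
        )
        picked.append(best)
    return [sentences[i] for i in sorted(picked)]
```

The key design point mirrored here is that clustering acts as a redundancy filter: sentences with similar embeddings compete within a cluster, and only one representative per cluster reaches the summary.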