Web-based Pretrained Transformer Model for Scientific Paper Summarization (WPT-SPS)
K. Girthana, S. Swamynathan, A. R. Nirupama, S. Sri Akshya, S. Adhithyan
2023 International Conference on Artificial Intelligence and Smart Communication (AISC), 27 January 2023. DOI: 10.1109/AISC56616.2023.10085409
The rapid growth of scientific publications makes it challenging for researchers to swiftly learn about breakthroughs in their domains. Scientific summarization addresses this challenge by providing summaries of the important contributions of scientific papers. This paper proposes a transfer learning technique for scientific paper summarization that generates abstractive summaries of scientific papers in a particular domain. The proposed model achieved improvements of around 12% and 20% over state-of-the-art models such as BART and Longformer.
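The abstract does not show the WPT-SPS implementation itself, so the following is only a minimal sketch of the general transfer-learning setup it describes: fine-tuning a pretrained seq2seq transformer on (paper text, reference summary) pairs and then generating abstractive summaries. The base model (facebook/bart-large-cnn, chosen because BART is one of the paper's baselines), the toy dataset, and all hyperparameters are illustrative assumptions, not the authors' actual configuration.

```python
# Sketch of transfer learning for abstractive scientific paper summarization.
# Assumptions: Hugging Face transformers/datasets; BART as the base model;
# toy in-memory data standing in for a domain-specific corpus. This is NOT
# the authors' WPT-SPS model, only an illustration of the approach.
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
    DataCollatorForSeq2Seq,
)
from datasets import Dataset

MODEL_NAME = "facebook/bart-large-cnn"  # assumed base checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

# Toy (paper text, reference summary) pairs; in practice these would be
# domain-specific papers and their abstracts collected from the web.
raw = Dataset.from_dict({
    "text": ["Full text of a scientific paper ..."],
    "summary": ["Reference abstract of the paper ..."],
})

def preprocess(batch):
    # Encode inputs and targets; truncate long paper bodies to the
    # encoder's context window.
    inputs = tokenizer(batch["text"], max_length=1024, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=256, truncation=True)
    inputs["labels"] = labels["input_ids"]
    return inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="wpt-sps-sketch",       # hypothetical output directory
    per_device_train_batch_size=2,
    num_train_epochs=3,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()

# Generate an abstractive summary for an unseen paper with beam search.
inputs = tokenizer("Another paper's full text ...", return_tensors="pt",
                   max_length=1024, truncation=True)
summary_ids = model.generate(**inputs, max_new_tokens=200, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

The key design point this sketch illustrates is the one the abstract claims: rather than training a summarizer from scratch, a transformer already pretrained on generic summarization is adapted to a particular scientific domain with comparatively little labeled data.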