{"title":"HITSZ TMG at ICASSP 2023 SPGC Shared Task: Leveraging Pre-Training and Distillation Method for Title Generation with Limited Resource","authors":"Tianxiao Xu, Zihao Zheng, Xinshuo Hu, Zetian Sun, Yu Zhao, Baotian Hu","doi":"10.1109/icassp49357.2023.10097026","DOIUrl":null,"url":null,"abstract":"In this paper, we present our proposed method for the shared task of the ICASSP 2023 Signal Processing Grand Challenge (SPGC). We participate in Topic Title Generation (TTG), Track 3 of General Meeting Understanding and Generation (MUG) [1] in SPGC. The primary objective of this task is to generate a title that effectively summarizes the given topic segment. With the constraints of limited model size and external dataset availability, we propose a method as Pre-training - Distillation / Fine-tuning (PDF), which can efficiently leverage the knowledge from large model and corpus. Our method achieves first place during preliminary and final contests in ICASSP2023 MUG Challenge Track 3.","PeriodicalId":113072,"journal":{"name":"ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/icassp49357.2023.10097026","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
In this paper, we present our method for the shared task of the ICASSP 2023 Signal Processing Grand Challenge (SPGC). We participate in Topic Title Generation (TTG), Track 3 of General Meeting Understanding and Generation (MUG) [1] in SPGC. The objective of this task is to generate a title that effectively summarizes a given topic segment. Under the challenge's constraints on model size and external dataset availability, we propose a method called Pre-training - Distillation / Fine-tuning (PDF), which efficiently leverages knowledge from a large model and corpus. Our method achieved first place in both the preliminary and final contests of the ICASSP 2023 MUG Challenge Track 3.
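The abstract does not spell out the distillation objective, but a common way to transfer knowledge from a large teacher into a size-constrained student seq2seq model is to combine a temperature-softened KL term on the output logits with the usual cross-entropy on gold titles. The sketch below is a minimal illustration of that standard recipe, not the authors' implementation; the names and hyperparameters (alpha, temperature) are assumptions.

```python
# Hedged sketch of a distillation loss for a small title-generation student
# learning from a large teacher, in the spirit of a Pre-training -
# Distillation / Fine-tuning (PDF) pipeline. Assumed, not from the paper.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      alpha=0.5, temperature=2.0, pad_id=-100):
    """Blend soft-target KL (teacher -> student) with hard-target CE."""
    # Soft targets: match the student's token distribution to the
    # teacher's, softened by the temperature to expose relative
    # probabilities among non-argmax tokens.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard targets: ordinary cross-entropy against the gold title tokens,
    # skipping padding positions.
    hard = F.cross_entropy(
        student_logits.view(-1, student_logits.size(-1)),
        labels.view(-1),
        ignore_index=pad_id,
    )
    return alpha * soft + (1.0 - alpha) * hard
```

In such a setup, the student would first be pre-trained, then trained with this combined loss against teacher logits on the meeting corpus, and finally fine-tuned on the TTG data with the hard-target term alone.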