Tasks-Embedded Reparameterization: A Novel Framework for Task-Specific Transfer Enhancement With Multitask Prompt Learning
Jingjing Liu, Yishuai Song, Rui Jiang, Yi Feng, Mo Tao, Yinlin Li
{"title":"任务内嵌再参数化:一种基于多任务提示学习的任务特定迁移增强新框架","authors":"Jingjing Liu, Yishuai Song, Rui Jiang, Yi Feng, Mo Tao, Yinlin Li","doi":"10.1155/int/1688391","DOIUrl":null,"url":null,"abstract":"<p>Current fine-tuning techniques for large pretrained language models (LLMs) face significant challenges, particularly regarding the high computational costs associated with adapting billions of parameters and their limitations in effectively addressing diverse language understanding tasks. These methods often result in an inability to manage inter-task dependencies effectively, leading to underutilization of inter-task information. To address these issues, we propose tasks-embedded reparameterization (TER), a novel parameter-efficient fine-tuning framework that exploits multitask learning to enhance task-specific capabilities. The TER model integrates prompt tuning and multitask reparameterization, merging task-specific experts and hidden states of target tasks in a unified model framework. Furthermore, it employs a dynamic, task-oriented gating mechanism to optimize the prompts output by the model. This method dynamically adjusts the parameters according to the differing requirements of the task, ensuring that the model optimally adjusts the parameters according to the specific requirements of the task, so that the task can find a suitable balance between different tasks and improve knowledge sharing and task adaptability. Experimental evaluations using the SuperGLUE benchmark demonstrate that TER consistently outperforms existing parameter-efficient fine-tuning techniques in both performance and computational efficiency, offering a promising solution for task-specific language understanding in both research and industry.</p>","PeriodicalId":14089,"journal":{"name":"International Journal of Intelligent Systems","volume":"2025 1","pages":""},"PeriodicalIF":3.7000,"publicationDate":"2025-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1155/int/1688391","citationCount":"0","resultStr":"{\"title\":\"Tasks-Embedded Reparameterization: A Novel Framework for Task-Specific Transfer Enhancement With Multitask Prompt Learning\",\"authors\":\"Jingjing Liu, Yishuai Song, Rui Jiang, Yi Feng, Mo Tao, Yinlin Li\",\"doi\":\"10.1155/int/1688391\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Current fine-tuning techniques for large pretrained language models (LLMs) face significant challenges, particularly regarding the high computational costs associated with adapting billions of parameters and their limitations in effectively addressing diverse language understanding tasks. These methods often result in an inability to manage inter-task dependencies effectively, leading to underutilization of inter-task information. To address these issues, we propose tasks-embedded reparameterization (TER), a novel parameter-efficient fine-tuning framework that exploits multitask learning to enhance task-specific capabilities. The TER model integrates prompt tuning and multitask reparameterization, merging task-specific experts and hidden states of target tasks in a unified model framework. Furthermore, it employs a dynamic, task-oriented gating mechanism to optimize the prompts output by the model. 
This method dynamically adjusts the parameters according to the differing requirements of the task, ensuring that the model optimally adjusts the parameters according to the specific requirements of the task, so that the task can find a suitable balance between different tasks and improve knowledge sharing and task adaptability. Experimental evaluations using the SuperGLUE benchmark demonstrate that TER consistently outperforms existing parameter-efficient fine-tuning techniques in both performance and computational efficiency, offering a promising solution for task-specific language understanding in both research and industry.</p>\",\"PeriodicalId\":14089,\"journal\":{\"name\":\"International Journal of Intelligent Systems\",\"volume\":\"2025 1\",\"pages\":\"\"},\"PeriodicalIF\":3.7000,\"publicationDate\":\"2025-10-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://onlinelibrary.wiley.com/doi/epdf/10.1155/int/1688391\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Intelligent Systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.1155/int/1688391\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Intelligent Systems","FirstCategoryId":"94","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1155/int/1688391","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Tasks-Embedded Reparameterization: A Novel Framework for Task-Specific Transfer Enhancement With Multitask Prompt Learning
Current fine-tuning techniques for large pretrained language models (LLMs) face significant challenges, particularly the high computational cost of adapting billions of parameters and their limited ability to handle diverse language understanding tasks. These methods often fail to manage inter-task dependencies, leaving inter-task information underused. To address these issues, we propose tasks-embedded reparameterization (TER), a novel parameter-efficient fine-tuning framework that exploits multitask learning to strengthen task-specific capabilities. TER integrates prompt tuning and multitask reparameterization, merging task-specific experts with the hidden states of target tasks in a unified model framework, and it employs a dynamic, task-oriented gating mechanism to optimize the prompts the model produces. By adjusting parameters to each task's specific requirements, the gate strikes a balance across tasks and improves both knowledge sharing and task adaptability. Experimental evaluations on the SuperGLUE benchmark show that TER consistently outperforms existing parameter-efficient fine-tuning techniques in both performance and computational efficiency, offering a promising solution for task-specific language understanding in research and industry.
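The abstract's description of TER's gating mechanism can be read as a soft mixture over task-specific prompt experts, conditioned on a task embedding and the target task's hidden states. The PyTorch sketch below is one minimal interpretation of such a mechanism, not the authors' implementation; all names and dimensions (TaskGatedPrompt, n_experts, prompt_len, d_model) are assumptions introduced purely for illustration.

```python
# Illustrative sketch only: one plausible reading of a task-oriented gate over
# task-specific prompt "experts". Names and shapes are hypothetical, not from the paper.
import torch
import torch.nn as nn


class TaskGatedPrompt(nn.Module):
    """Merge shared prompt experts into a task-specific soft prompt via a gate."""

    def __init__(self, n_experts: int, n_tasks: int, prompt_len: int, d_model: int):
        super().__init__()
        # Pool of learnable prompt experts shared across all tasks.
        self.experts = nn.Parameter(torch.randn(n_experts, prompt_len, d_model) * 0.02)
        # Learnable embedding for each target task, used to drive the gate.
        self.task_emb = nn.Embedding(n_tasks, d_model)
        # Gate scores each expert from the task embedding and the pooled hidden state.
        self.gate = nn.Linear(2 * d_model, n_experts)

    def forward(self, task_id: torch.Tensor, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, d_model) hidden states of the target task's input.
        pooled = hidden.mean(dim=1)                    # (batch, d_model)
        task = self.task_emb(task_id)                  # (batch, d_model)
        weights = torch.softmax(self.gate(torch.cat([task, pooled], dim=-1)), dim=-1)
        # Weighted merge of experts -> one soft prompt per example.
        prompt = torch.einsum("be,epd->bpd", weights, self.experts)
        return prompt  # (batch, prompt_len, d_model), prepended to the frozen LLM's inputs


# Usage sketch: build per-example prompts for a batch drawn from task 2.
module = TaskGatedPrompt(n_experts=4, n_tasks=8, prompt_len=20, d_model=768)
hidden = torch.randn(3, 16, 768)                       # e.g., frozen-encoder hidden states
prompt = module(torch.full((3,), 2, dtype=torch.long), hidden)
print(prompt.shape)                                    # torch.Size([3, 20, 768])
```

Under this reading, the softmax gate acts as a soft mixture-of-experts over prompts: every expert contributes to every task, but the task embedding and pooled hidden state shift the weighting per example, which is one way such a framework could share knowledge across tasks while remaining task-adaptive.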
Journal Introduction:
The International Journal of Intelligent Systems serves as a forum for individuals interested in tapping into the vast body of theory behind the construction of intelligent systems. With its peer-reviewed format, the journal presents editorials written by today's experts in the field. Because new developments are introduced each day, there is much to be learned: examination, analysis, creation, information retrieval, man–computer interaction, and more. The International Journal of Intelligent Systems uses charts and illustrations to demonstrate these ground-breaking topics, and encourages readers to share their thoughts and experiences.