Wang Xu, Shuo Wang, Weilin Zhao, Xu Han, Yukun Yan, Yudi Zhang, Zhe Tao, Zhiyuan Liu, Wanxiang Che
arXiv - CS - Computation and Language · arxiv-2409.11727 · published 2024-09-18
Enabling Real-Time Conversations with Minimal Training Costs
Large language models (LLMs) have demonstrated the ability to improve human
efficiency through conversational interactions. Conventional LLM-powered
dialogue systems, operating on a turn-based paradigm, preclude real-time
interaction during response generation. To address this limitation, researchers
have proposed duplex models. These models can dynamically adapt to user input,
facilitating real-time interactive feedback. However, these methods typically
require substantial computational resources to acquire this capability. To reduce
overhead, this paper presents a new duplex decoding approach that endows LLMs
with duplex capability while requiring minimal additional training. Specifically, our
method employs parallel decoding of queries and responses in conversations,
effectively implementing a channel-division-multiplexing decoding strategy.
Experimental results indicate that our proposed method significantly enhances
the naturalness and human-likeness of user-AI interactions with minimal
training costs.
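The abstract's core idea, decoding queries and responses in parallel as two multiplexed channels, can be illustrated with a small simulation. The sketch below is not the paper's implementation; `duplex_decode`, `model_step`, and the channel-tagging scheme are hypothetical names chosen to show how a decoder might interleave newly arriving user tokens with response generation instead of waiting for a full turn to end.

```python
def duplex_decode(model_step, incoming_queries, max_steps=50):
    """Toy sketch of channel-division-multiplexed duplex decoding.

    At every decoding step the loop (a) folds in any query tokens that
    arrived mid-generation and (b) emits the next response token from the
    up-to-date context, so user input can steer the reply in real time.

    model_step: callable(context) -> next response token, or None to stop
    incoming_queries: dict mapping step index -> query tokens arriving then
    """
    context = []   # shared token stream tagged by channel
    response = []
    for step in range(max_steps):
        # Query channel: absorb user tokens that arrived during generation.
        for tok in incoming_queries.get(step, []):
            context.append(("query", tok))
        # Response channel: generate conditioned on the latest context.
        tok = model_step(context)
        if tok is None:
            break
        context.append(("response", tok))
        response.append(tok)
    return response

# Stand-in "model": acknowledges the most recent query token, stops at 5 tokens.
def toy_model(context):
    queries = [t for ch, t in context if ch == "query"]
    responses = [t for ch, t in context if ch == "response"]
    if len(responses) >= 5 or not queries:
        return None
    return "ack:" + queries[-1]

out = duplex_decode(toy_model, {0: ["hello"], 2: ["actually, stop"]})
```

In this run the reply shifts as soon as the second query fragment arrives at step 2, mimicking the real-time adaptation the paper attributes to duplex models; a genuine implementation would replace `toy_model` with an LLM forward pass over a suitably formatted shared token stream.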