A Lightweight Task-Agreement Meta Learning for Low-Resource Speech Recognition
Yaqi Chen, Hao Zhang, Wenlin Zhang, Dan Qu, Xukui Yang
Neural Processing Letters, published 2024-07-05. DOI: https://doi.org/10.1007/s11063-024-11661-6
Citations: 0
Abstract
Meta-learning has proven to be a powerful paradigm for transferring knowledge from prior tasks to facilitate quick learning of new tasks in automatic speech recognition. However, differences between languages (tasks) lead to variations in task learning directions, causing harmful competition for the model's limited resources. To address this challenge, we introduce task-agreement multilingual meta-learning (TAMML), which adopts the gradient agreement algorithm to guide the model parameters towards a direction where tasks exhibit greater consistency. However, the computation and storage costs of TAMML grow dramatically as the model's depth increases. To address this, we further propose a simplification called TAMML-Light, which uses only the output layer for gradient calculation. Experiments on three datasets demonstrate that TAMML and TAMML-Light outperform existing meta-learning approaches, yielding superior results. Furthermore, TAMML-Light reduces the relative increase in computation cost by at least 80% compared to TAMML.
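The abstract describes weighting per-task (per-language) gradients by how much they agree with each other, so the update favors directions the tasks are consistent about. The snippet below is a minimal, illustrative sketch of that idea, not the authors' implementation: the function names and the toy data are hypothetical, and the TAMML-Light remark in the usage comment only reflects the abstract's statement that gradients are restricted to the output layer.

```python
# Illustrative sketch of gradient-agreement weighting (assumed form, not the
# paper's code): each task gradient is weighted by its inner product with the
# summed gradient, so tasks that agree with the consensus direction contribute
# more to the update.
import torch


def agreement_weights(task_grads: list[torch.Tensor]) -> torch.Tensor:
    """Return one weight per task, proportional to agreement with the sum."""
    stacked = torch.stack(task_grads)            # (num_tasks, num_params)
    total = stacked.sum(dim=0)                   # consensus direction
    scores = stacked @ total                     # agreement score per task
    return scores / scores.abs().sum().clamp_min(1e-12)


def combine_task_grads(task_grads: list[torch.Tensor]) -> torch.Tensor:
    """Aggregate per-task gradients into one agreement-weighted update."""
    w = agreement_weights(task_grads)
    return (w.unsqueeze(1) * torch.stack(task_grads)).sum(dim=0)


# Usage (toy data): four "languages", each with a flattened gradient vector.
# In a TAMML-Light-style variant these would be output-layer gradients only,
# which keeps the agreement computation cheap regardless of model depth.
grads = [torch.randn(1000) for _ in range(4)]
update = combine_task_grads(grads)
```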
About the journal
Neural Processing Letters is an international journal publishing research results and innovative ideas on all aspects of artificial neural networks. Coverage includes theoretical developments, biological models, new formal modes, learning, applications, software and hardware developments, and prospective research.
The journal promotes fast exchange of information in the community of neural network researchers and users. The resurgence of interest in the field of artificial neural networks since the beginning of the 1980s has been coupled with tremendous research activity in specialized and multidisciplinary groups. Research, however, is not possible without good communication between people and the exchange of information, especially in a field covering such different areas; fast communication is also a key aspect, and this is the rationale for Neural Processing Letters.