Title: Meta-Adaptable-Adapter: Efficient adaptation of self-supervised models for low-resource speech recognition
Journal: Neurocomputing (Q1, Computer Science, Artificial Intelligence)
DOI: 10.1016/j.neucom.2024.128493
Publication date: 2024-08-29
URL: https://www.sciencedirect.com/science/article/pii/S0925231224012645
Citations: 0
Abstract
Self-supervised models have demonstrated remarkable performance in speech processing by learning latent representations from large amounts of unlabeled data. Adapting these models to low-resource languages yields promising results, but the computational cost of fine-tuning all model parameters is prohibitively high. Adapters offer a solution by introducing lightweight bottleneck structures into pre-trained models for downstream tasks, enabling parameter-efficient adaptation. However, randomly initialized adapters often underperform in extremely low-resource scenarios. To address this issue, we explore the Meta-Adapter for self-supervised models and analyze its limitations, including weak learning of language-specific knowledge and meta-overfitting. To alleviate these problems, we propose the Meta-Adaptable-Adapter (MAA), a new meta-learning algorithm that adapts to low-resource languages quickly and effectively. MAA learns task-specific adapters for feature extraction and task-independent adapters for feature combination. Experiments on three datasets covering 31 low-resource languages across seven language families show superior performance compared to other adapters, demonstrating better generalization and extensibility.
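The "lightweight bottleneck structure" mentioned in the abstract is the standard adapter design: a down-projection, a nonlinearity, an up-projection, and a residual connection, inserted into a frozen pre-trained model. The sketch below is a minimal generic illustration of that structure in numpy, not the paper's MAA module; the function name and weight shapes are assumptions for illustration only.

```python
import numpy as np

def bottleneck_adapter(x, W_down, W_up):
    """Generic bottleneck adapter forward pass (hypothetical sketch).

    x:      (batch, d_model) hidden states from the frozen backbone
    W_down: (d_model, d_bottleneck) down-projection, d_bottleneck << d_model
    W_up:   (d_bottleneck, d_model) up-projection
    """
    h = np.maximum(x @ W_down, 0.0)  # down-project, then ReLU nonlinearity
    return x + h @ W_up              # up-project and add residual connection
```

A common property of this design is that initializing `W_up` to zeros makes the adapter an identity function at the start of training, so inserting it does not perturb the pre-trained model's behavior.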
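The abstract's motivation for meta-learning is that randomly initialized adapters underperform when target-language data is scarce, so the adapter parameters should instead be meta-initialized across many languages. As a rough intuition for how such an initialization can be learned, here is a first-order meta-learning loop in the style of Reptile on a toy linear model; this is a hedged sketch under assumed names and settings, not the MAA algorithm from the paper.

```python
import numpy as np

def meta_init(tasks, theta, inner_steps=5, inner_lr=0.1, meta_lr=0.5, epochs=50):
    """First-order meta-learning (Reptile-style) of an initialization theta.

    tasks: list of (X, y) regression tasks standing in for per-language data.
    theta: initial parameter vector, updated toward a point from which each
           task can be solved in a few gradient steps.
    """
    theta = theta.astype(float).copy()
    for _ in range(epochs):
        for X, y in tasks:
            phi = theta.copy()
            for _ in range(inner_steps):
                # gradient of mean-squared error for a linear model
                grad = 2.0 * X.T @ (X @ phi - y) / len(y)
                phi -= inner_lr * grad
            # move the meta-initialization toward the adapted parameters
            theta += meta_lr * (phi - theta)
    return theta
```

In the adapter setting, `theta` would play the role of the adapter weights, and each task would be a low-resource language; the loop finds an initialization that a few fine-tuning steps can adapt quickly.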
About the journal:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing, covering neurocomputing theory, practice, and applications.