Entropy-Guided Meta-Initialization regularization for few-shot text classification
Authors: Jongyun Shin, Jinwoo Kim, Jangho Kim
Journal: Knowledge-Based Systems, Volume 326, Article 113961
DOI: 10.1016/j.knosys.2025.113961
Published: 2025-07-02
Citations: 0
Abstract
Meta-learning has emerged as the dominant approach to the problem of limited datasets in text classification and has achieved state-of-the-art performance. In particular, gradient-based meta-learning finds a good meta-initialization (set of parameters) by learning on a variety of tasks, which is known to be effective for the few-shot problem. However, we find that the meta-initialization produced by existing methods is over-confident toward specific classes before task adaptation, which can degrade generalization performance. To address this issue, we propose a simple and effective Entropy-Guided Meta-Initialization regularization (EGMI) method. EGMI maximizes the entropy of the model's predictions so that the meta-initialization avoids over-confidence toward specific classes. In our experiments, we show that EGMI outperforms current state-of-the-art methods on few-shot benchmark datasets by a large margin. In particular, we improve accuracy from 84.13% to 91.36% on the 15-way 5-shot setting of the clinc150 dataset. EGMI requires no additional parameters and does not increase training cost. Our code is publicly available at https://anonymous.4open.science/r/EGMI-9D60/README.md.
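The core idea of the abstract — penalizing over-confident predictions at the meta-initialization by maximizing predictive entropy — can be sketched with a simple entropy term. This is an illustrative sketch only, not the paper's actual EGMI implementation; the function names and the uniform/confident examples are hypothetical:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class dimension.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def predictive_entropy(logits):
    """Mean entropy of the predicted class distribution.

    Adding this term (with a positive weight) to a meta-learning
    objective so that it is maximized discourages the
    meta-initialization from being over-confident toward any class.
    """
    p = softmax(logits)
    return float(-(p * np.log(p + 1e-12)).sum(axis=-1).mean())

# Uniform logits over 5 classes give the maximum entropy, log(5);
# a highly confident prediction gives entropy near zero.
uniform_logits = np.zeros((1, 5))
confident_logits = np.array([[10.0, 0.0, 0.0, 0.0, 0.0]])
print(predictive_entropy(uniform_logits))    # ≈ log(5) ≈ 1.609
print(predictive_entropy(confident_logits))  # close to 0
```

In a gradient-based meta-learning loop, such a term would be subtracted from the outer-loop loss (so that entropy is maximized) before task adaptation; the exact weighting and placement in EGMI are described in the paper itself.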
About the journal:
Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research in the field. It focuses on systems built with knowledge-based and other artificial intelligence techniques. The journal aims to support human prediction and decision-making through data science and computational techniques, to provide balanced coverage of theory and practice, and to encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.