Compressing model with few class-imbalance samples: An out-of-distribution expedition

Tian-Shuang Wu, Shen-Huan Lyu, Yanyan Wang, Ning Chen, Zhihao Qu, Baoliu Ye

Pattern Recognition Letters, Volume 201, Pages 117-124 (March 2026). First published online: 13 January 2026. DOI: 10.1016/j.patrec.2026.01.010. Available at: https://www.sciencedirect.com/science/article/pii/S016786552600022X
Citations: 0
Abstract
Few-sample model compression aims to compress a large pre-trained model into a compact one using only a few samples. However, previous methods typically assume a balanced class distribution, an assumption that is costly to satisfy under severe data scarcity; when the available samples are imbalanced, the compressed model exhibits significant performance degradation. We propose a novel framework named OOD-Enhanced Few-Sample Model Compression (OE-FSMC), which introduces out-of-distribution (OOD) samples with dynamically assigned labels to prevent class bias during compression. To avoid overfitting to the OOD samples, we incorporate a joint distillation loss and a class-dependent regularization term. Extensive experiments on multiple benchmark datasets show that our framework can be seamlessly incorporated into existing few-sample model compression methods, effectively mitigating the accuracy degradation caused by class imbalance.
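To make the training objective concrete, below is a minimal sketch of how a joint distillation loss over in-distribution and OOD samples might look in PyTorch. The function name, the temperature-scaled KL form, the use of the teacher's predictions as dynamic OOD labels, and the particular class-dependent regularizer are illustrative assumptions based on the abstract, not the paper's exact formulation.

import torch
import torch.nn.functional as F

def joint_distillation_loss(student_logits_id, teacher_logits_id,
                            student_logits_ood, teacher_logits_ood,
                            class_counts, temperature=4.0,
                            ood_weight=0.5, reg_weight=0.1):
    # Temperature-scaled KL distillation on the few in-distribution samples.
    kd_id = F.kl_div(
        F.log_softmax(student_logits_id / temperature, dim=1),
        F.softmax(teacher_logits_id / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

    # OOD samples carry "dynamic" labels: here the teacher's current soft
    # predictions stand in for the ground truth that OOD data lacks.
    kd_ood = F.kl_div(
        F.log_softmax(student_logits_ood / temperature, dim=1),
        F.softmax(teacher_logits_ood / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

    # Class-dependent regularization (assumed form): weight each class's
    # average predicted mass by its relative frequency, so minimizing the
    # term discourages the student from drifting toward majority classes.
    class_freq = class_counts.float() / class_counts.sum()
    mean_pred = F.softmax(student_logits_id, dim=1).mean(dim=0)
    reg = (class_freq * mean_pred).sum()

    # Down-weighting the OOD term keeps the student from overfitting to
    # OOD samples while still using them to cover more of the input space.
    return kd_id + ood_weight * kd_ood + reg_weight * reg

In this sketch, the down-weighted OOD term (ood_weight) plays the role the abstract assigns to the joint distillation loss, limiting how much the compact student can overfit the OOD samples, while the frequency-weighted penalty counteracts bias toward majority classes.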
Journal introduction:
Pattern Recognition Letters aims at rapid publication of concise articles of broad interest in pattern recognition.
Subject areas include all the current fields of interest represented by the Technical Committees of the International Association of Pattern Recognition, and other developing themes involving learning and recognition.