Title: Robust transfer learning under generalized linear errors-in-variables models
Authors: Zhenglong Zhang, Houlin Zhou, Xuejun Wang
Journal: Journal of Statistical Planning and Inference, Vol. 243, Article 106378
DOI: 10.1016/j.jspi.2026.106378
Publication date: 2026-07-01 (Epub 2026-01-24)
JCR: Q3, Statistics & Probability; Impact factor: 0.8; CAS: Region 4 (Mathematics)
URL: https://www.sciencedirect.com/science/article/pii/S0378375826000066
Citations: 0
Abstract
Transfer learning enhances statistical modeling by utilizing source-task information, but its effectiveness can be compromised when the common assumption of error-free covariates is violated, as measurement error often leads to biased estimates and invalid inference. To address this critical issue, we propose a novel transfer learning framework for generalized linear errors-in-variables models (GLEVMs), which account for classical additive measurement error in covariates. We introduce a functional similarity structure linking source and target parameters, and develop the errors-in-variables transfer learning likelihood (ev-TLL) method based on weighted likelihood. Under mild regularity conditions, we establish the asymptotic normality of the proposed estimator and demonstrate that it achieves faster convergence rates than traditional methods without transfer learning. Extensive simulations under both linear and nonlinear GLEVMs confirm the superior estimation accuracy of our approach. Finally, a real data application to the Maryland Biological Stream Survey highlights the practical benefits of ev-TLL over models using only target-domain data.
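The abstract combines two ingredients: correcting the bias that classical additive measurement error (observing W = X + U rather than X) induces in regression estimates, and borrowing strength from a larger source sample. The paper's ev-TLL method does this through a weighted likelihood for generalized linear models; the sketch below is only a toy linear-model illustration of the same two ideas, with a method-of-moments attenuation correction and a hypothetical precision-style weighting of source and target estimates (all variable names and weights are illustrative assumptions, not the authors' estimator):

```python
import numpy as np

rng = np.random.default_rng(0)

# Classical additive measurement error: we observe W = X + U instead of X.
beta_true = 2.0
sigma_x2, sigma_u2 = 1.0, 0.5  # variances of X and of the error U (assumed known)

def simulate(n):
    x = rng.normal(0.0, np.sqrt(sigma_x2), n)
    w = x + rng.normal(0.0, np.sqrt(sigma_u2), n)  # error-prone covariate
    y = beta_true * x + rng.normal(0.0, 0.3, n)
    return w, y

def naive_slope(w, y):
    """OLS of y on the noisy W; biased toward zero (attenuation)."""
    return np.sum(w * y) / np.sum(w * w)

def corrected_slope(w, y):
    """Method-of-moments correction: divide the naive slope by the
    reliability ratio sigma_x^2 / (sigma_x^2 + sigma_u^2),
    assuming the error variance sigma_u^2 is known."""
    reliability = sigma_x2 / (sigma_x2 + sigma_u2)
    return naive_slope(w, y) / reliability

# Toy transfer step: a sample-size-weighted average of a small target-sample
# estimate and a large source-sample estimate from a similar population.
# (The paper's ev-TLL instead weights the likelihoods themselves.)
w_t, y_t = simulate(50)      # small target sample
w_s, y_s = simulate(5000)    # large source sample
b_t = corrected_slope(w_t, y_t)
b_s = corrected_slope(w_s, y_s)
lam = 50 / (50 + 5000)       # illustrative weight, proportional to sample size
b_transfer = lam * b_t + (1 - lam) * b_s

print(f"naive (attenuated): {naive_slope(w_s, y_s):.3f}")
print(f"corrected         : {b_s:.3f}")
print(f"transfer estimate : {b_transfer:.3f}")
```

With these settings the naive slope concentrates near beta_true times the reliability ratio (about 1.33 rather than 2), while the corrected and pooled estimates recover the true slope; the pooled estimate is also less variable than the 50-observation target-only fit, mirroring the faster convergence rates the abstract claims for transfer learning.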
Journal description
The Journal of Statistical Planning and Inference serves as a multifaceted and inclusive bridge between classical aspects of statistics and probability and the emerging interdisciplinary areas that have the potential to revolutionize the subject. While we maintain our traditional strength in statistical inference, design, classical probability, and large-sample methods, we also embrace a far broader scope to keep pace with the new problems that confront us as statisticians, mathematicians, and scientists.
We publish high-quality articles in all branches of statistics, probability, discrete mathematics, machine learning, and bioinformatics. We especially welcome well-written and up-to-date review articles on fundamental themes of statistics, probability, machine learning, and general biostatistics. Thoughtful letters to the editors, interesting problems in need of a solution, and short notes carrying an element of elegance or beauty are equally welcome.