AdaShift: Anti-Collapse and Real-Time Deep Model Evolution for Mobile Vision Applications
Ke Ma; Bin Guo; Sicong Liu; Cheng Fang; Siqi Luo; Zimu Zheng; Zhiwen Yu
IEEE Transactions on Mobile Computing, vol. 24, no. 10, pp. 10573-10589
DOI: 10.1109/TMC.2025.3572215
Published: 2025-03-26
URL: https://ieeexplore.ieee.org/document/11015910/
Citations: 0
Abstract
As computational hardware advances, integrating deep learning (DL) models into mobile devices has become ubiquitous for visual tasks. However, "data distribution shift" in live sensory data can degrade the accuracy of mobile DL models. Conventional domain adaptation methods, constrained by their dependence on pre-compiled static datasets for offline adaptation, are fundamentally limited in real-time practicality. While modern online adaptation methods enable incremental model evolution, they suffer from two critical shortcomings: computational latency, because excessive resource demands on mobile devices compromise temporal responsiveness, and accuracy collapse, caused by error accumulation from unreliable pseudo-labeling. To address these challenges, we introduce AdaShift, a cloud-assisted framework that enables real-time online model adaptation for vision-based mobile systems operating under non-stationary data distributions. To ensure real-time performance, AdaShift employs an adaptation trigger and a plug-and-play adaptation mechanism that minimize redundant adaptation requests and reduce per-request cost. To prevent accuracy collapse, AdaShift introduces an anti-collapse parameter restoration mechanism that explicitly recovers knowledge, ensuring stable accuracy improvements during model evolution. Extensive experiments across various vision tasks and model architectures show that AdaShift delivers superior accuracy with 100 ms-level adaptation latency, striking a better balance between accuracy and real-time performance than the baselines.
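The abstract describes the two mechanisms only at a high level. The sketch below is a minimal, hypothetical illustration (not the paper's implementation) of how a shift-triggered adaptation request and an anti-collapse weight-restoration step might look on the device side. The class name, the feature-mean drift score, the trigger threshold, and the interpolation coefficient `restore_ratio` are all assumptions introduced for illustration.

```python
# A minimal, hypothetical sketch (not the authors' implementation) of the two
# ideas summarized in the abstract: (1) an adaptation trigger that only issues
# a cloud-side adaptation request when the live feature distribution drifts
# past a threshold, and (2) an anti-collapse restoration step that blends the
# freshly adapted weights with the original (source) weights before they are
# deployed on the device. Names, the drift score, the threshold, and the
# blending rule are all illustrative assumptions.

import copy

import torch
import torch.nn as nn


class ShiftTriggeredAdapter:
    def __init__(self, model: nn.Module, shift_threshold: float = 0.5,
                 restore_ratio: float = 0.2):
        self.model = model
        self.source_state = copy.deepcopy(model.state_dict())  # anchor weights
        self.shift_threshold = shift_threshold
        self.restore_ratio = restore_ratio  # share of source weights mixed back in
        self.reference_mean = None          # feature statistics of the source domain

    @torch.no_grad()
    def shift_score(self, features: torch.Tensor) -> float:
        """Distance between live and reference feature statistics."""
        live_mean = features.mean(dim=0)
        if self.reference_mean is None:
            self.reference_mean = live_mean  # first batch defines the reference
            return 0.0
        return torch.norm(live_mean - self.reference_mean).item()

    def should_adapt(self, features: torch.Tensor) -> bool:
        """Trigger a (cloud-side) adaptation request only under real drift,
        which keeps redundant requests and per-request cost down."""
        return self.shift_score(features) > self.shift_threshold

    @torch.no_grad()
    def apply_adapted_weights(self, adapted_state: dict) -> None:
        """Anti-collapse restoration: interpolate adapted weights toward the
        source weights so unreliable pseudo-label updates cannot drag the
        model arbitrarily far from its original knowledge."""
        blended = {}
        for name, adapted in adapted_state.items():
            source = self.source_state[name]
            if adapted.is_floating_point():
                blended[name] = (1 - self.restore_ratio) * adapted \
                    + self.restore_ratio * source
            else:
                blended[name] = adapted  # e.g. integer counters are kept as-is
        self.model.load_state_dict(blended)
```

In this sketch the restoration step simply interpolates each adapted tensor toward its original value; the paper's actual restoration mechanism may differ, but the intent matches the anti-collapse goal described above: bounding how far pseudo-label-driven updates can move the model away from its original knowledge while still accepting the adapted weights.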
About the Journal
IEEE Transactions on Mobile Computing addresses key technical issues related to various aspects of mobile computing. This includes (a) architectures, (b) support services, (c) algorithm/protocol design and analysis, (d) mobile environments, (e) mobile communication systems, (f) applications, and (g) emerging technologies. Topics of interest span a wide range, covering aspects like mobile networks and hosts, mobility management, multimedia, operating system support, power management, online and mobile environments, security, scalability, reliability, and emerging technologies such as wearable computers, body area networks, and wireless sensor networks. The journal serves as a comprehensive platform for advancements in mobile computing research.