{"title":"TinyML-Driven On-Device Sports Command Recognition in Mobile and Dynamic Environments","authors":"Jiali Zang","doi":"10.1002/itl2.70090","DOIUrl":null,"url":null,"abstract":"<div>\n \n <p>In this article, we propose a novel TinyML-based framework for real-time sports command recognition under mobile conditions. Unlike conventional Human Activity Recognition (HAR) systems that rely on cloud-based processing or heavy on-device models, our method leverages lightweight deep neural networks, personalized transfer learning, and signal augmentation techniques to perform low-latency and energy-efficient inference directly on microcontroller-class devices. The system is designed to recognize a set of critical sports instructions (e.g., “Start Running,” “Jump,” and “Sprint”) in mobile or outdoor environments using only wearable inertial sensors. Extensive experiments demonstrate our method outperforms several state-of-the-art baselines in accuracy (95.8%), model size (14.5 KB), and energy efficiency (0.82 mJ per inference). Compared to prior wearable HAR systems, our method uniquely integrates motion-aware segmentation and user-personalized few-shot adaptation, resulting in a 5.3% accuracy gain and 4× model compression over baseline TinyML frameworks. The proposed method provides an effective balance between model accuracy, generalization, and hardware efficiency, even in scenarios with significant motion noise and environmental variability.</p>\n </div>","PeriodicalId":100725,"journal":{"name":"Internet Technology Letters","volume":"8 5","pages":""},"PeriodicalIF":0.5000,"publicationDate":"2025-07-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Internet Technology Letters","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/itl2.70090","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"TELECOMMUNICATIONS","Score":null,"Total":0}
Abstract
In this article, we propose a novel TinyML-based framework for real-time sports command recognition under mobile conditions. Unlike conventional Human Activity Recognition (HAR) systems that rely on cloud-based processing or heavy on-device models, our method leverages lightweight deep neural networks, personalized transfer learning, and signal augmentation techniques to perform low-latency, energy-efficient inference directly on microcontroller-class devices. The system is designed to recognize a set of critical sports instructions (e.g., “Start Running,” “Jump,” and “Sprint”) in mobile or outdoor environments using only wearable inertial sensors. Extensive experiments demonstrate that our method outperforms several state-of-the-art baselines in accuracy (95.8%), model size (14.5 KB), and energy efficiency (0.82 mJ per inference). Compared with prior wearable HAR systems, our method uniquely integrates motion-aware segmentation and user-personalized few-shot adaptation, yielding a 5.3% accuracy gain and 4× model compression over baseline TinyML frameworks. The proposed method provides an effective balance among model accuracy, generalization, and hardware efficiency, even in scenarios with significant motion noise and environmental variability.
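To make the microcontroller-class deployment concrete, the sketch below shows one common way such a system could be built: a small 1-D CNN over windowed 6-axis IMU data, compressed with post-training int8 quantization for TensorFlow Lite Micro-style runtimes. This is a minimal illustration under stated assumptions, not the paper's actual architecture; the window length (128 samples), layer sizes, three-command label set, and placeholder data are all assumptions introduced for the example.

```python
# Illustrative sketch only: a tiny 1-D CNN over 6-axis IMU windows with
# post-training int8 quantization, in the spirit of the microcontroller-class
# models described in the abstract. Window length, layer widths, and the
# command set are assumptions, not the paper's reported configuration.
import numpy as np
import tensorflow as tf

WINDOW, CHANNELS, NUM_COMMANDS = 128, 6, 3  # e.g., "Start Running", "Jump", "Sprint"

def build_tiny_cnn():
    # Depthwise-separable convolutions keep the parameter count small enough
    # for a flash footprint on the order of tens of kilobytes after quantization.
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(WINDOW, CHANNELS)),
        tf.keras.layers.SeparableConv1D(8, 5, strides=2, activation="relu"),
        tf.keras.layers.SeparableConv1D(16, 5, strides=2, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(NUM_COMMANDS, activation="softmax"),
    ])

model = build_tiny_cnn()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Placeholder arrays stand in for segmented IMU windows; a real pipeline would
# train on labeled accelerometer/gyroscope recordings of the sports commands.
x_dummy = np.random.randn(256, WINDOW, CHANNELS).astype(np.float32)
y_dummy = np.random.randint(0, NUM_COMMANDS, size=256)
model.fit(x_dummy, y_dummy, epochs=1, batch_size=32, verbose=0)

# Post-training int8 quantization shrinks the model for microcontroller
# deployment; a representative dataset calibrates the activation ranges.
def representative_data():
    for i in range(64):
        yield [x_dummy[i:i + 1]]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
tflite_model = converter.convert()
print(f"Quantized model size: {len(tflite_model) / 1024:.1f} KB")
```

The quantized flatbuffer produced at the end is the kind of artifact that would be compiled into firmware and executed by an on-device interpreter; the motion-aware segmentation and few-shot personalization steps described in the abstract would sit upstream and downstream of this model, respectively.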