Optimizing TinyML: The Impact of Reduced Data Acquisition Rates for Time Series Classification on Microcontrollers

Riya Samanta, Bidyut Saha, Soumya K. Ghosh, Ram Babu Roy
arXiv - CS - Machine Learning | Published: 2024-09-17 | https://doi.org/arxiv-2409.10942

Abstract

Tiny Machine Learning (TinyML) enables efficient, low-cost, and privacy-preserving machine learning inference directly on microcontroller units (MCUs) connected to sensors. Optimizing models for these constrained environments is crucial. This paper investigates how reducing data acquisition rates affects TinyML models for time series classification, focusing on resource-constrained, battery-operated IoT devices. By lowering the data sampling frequency, we aim to reduce computational demands, RAM usage, energy consumption, latency, and MAC operations by approximately fourfold while maintaining similar classification accuracies. Our experiments with six benchmark datasets (UCIHAR, WISDM, PAMAP2, MITBIH, MHEALTH, and PTB) showed that reducing data acquisition rates significantly cut energy consumption and computational load, with minimal accuracy loss. For example, a 75% reduction in acquisition rate for the MITBIH and PTB datasets led to a 60% decrease in RAM usage, a 75% reduction in MAC operations, a 74% decrease in latency, and a 70% reduction in energy consumption, without accuracy loss. These results offer valuable insights for deploying efficient TinyML models in constrained environments.
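The abstract's central lever is cutting the sampling rate by 75%, which shrinks each input window (and hence the model's RAM footprint and MAC count) roughly fourfold. The paper's exact preprocessing is not given here; the sketch below illustrates one simple way this could look, plain decimation of a sensor window, with the window length (128) and factor (4) chosen for illustration only. Real pipelines typically low-pass filter before decimating to avoid aliasing.

```python
# Hedged sketch (assumed parameters, not the paper's exact pipeline):
# emulate a sensor sampled at 1/4 of the original rate by keeping
# every 4th sample of an already-acquired window.

def downsample(signal, factor=4):
    """Keep every `factor`-th sample of `signal`.

    A 75% reduction in acquisition rate corresponds to factor=4:
    the classifier's input window shrinks fourfold, which is what
    drives the reported drops in RAM usage and MAC operations.
    """
    return signal[::factor]

# Example: a 128-sample window (e.g. ~2.5 s at 50 Hz) becomes a
# 32-sample window at the reduced rate.
window = list(range(128))
reduced = downsample(window)
assert len(reduced) == 32
```

Because a fully connected or convolutional layer's MAC count scales with its input length, a fourfold shorter window directly yields the ~75% MAC reduction the abstract reports, while accuracy depends on how much task-relevant frequency content survives the lower rate.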