Optimizing TinyML: The Impact of Reduced Data Acquisition Rates for Time Series Classification on Microcontrollers

Riya Samanta, Bidyut Saha, Soumya K. Ghosh, Ram Babu Roy

arXiv - CS - Machine Learning, 2024-09-17. DOI: arxiv-2409.10942
Abstract
Tiny Machine Learning (TinyML) enables efficient, low-cost, and privacy-preserving machine learning inference directly on microcontroller units (MCUs) connected to sensors. Optimizing models for these constrained environments is crucial. This paper investigates how reducing data acquisition rates affects TinyML models for time series classification, focusing on resource-constrained, battery-operated IoT devices. By lowering the data sampling frequency, we aim to reduce computational demands (RAM usage, energy consumption, latency, and MAC operations) by approximately fourfold while maintaining similar classification accuracies. Our experiments with six benchmark datasets (UCIHAR, WISDM, PAMAP2, MHEALTH, MITBIH, and PTB) showed that reducing data acquisition rates significantly cut energy consumption and computational load with minimal accuracy loss. For example, a 75% reduction in acquisition rate for the MITBIH and PTB datasets led to a 60% decrease in RAM usage, a 75% reduction in MAC operations, a 74% decrease in latency, and a 70% reduction in energy consumption, without accuracy loss. These results offer valuable insights for deploying efficient TinyML models in constrained environments.
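To make the mechanism behind these savings concrete, the sketch below emulates a reduced acquisition rate by decimating an already-sampled window and then counts the multiply-accumulate (MAC) operations of a single 1D convolution layer at both input lengths. The window length, kernel size, and channel counts are hypothetical illustrations, not the paper's actual model architecture, and real sensor pipelines would apply an anti-aliasing low-pass filter before decimating.

```python
import numpy as np

def downsample(signal, factor=4):
    """Naive decimation: keep every `factor`-th sample.

    Emulates acquiring data at 1/factor of the original rate
    (a 75% reduction when factor=4).
    """
    return signal[::factor]

def conv1d_macs(input_len, kernel=5, in_ch=1, out_ch=16):
    """MACs for one 1D conv layer, 'valid' padding, stride 1.

    MACs scale linearly with output length, hence roughly
    linearly with input length.
    """
    out_len = input_len - kernel + 1
    return out_len * kernel * in_ch * out_ch

# Hypothetical 256-sample window of a 1 Hz sinusoid at 100 Hz.
fs = 100
window = np.sin(2 * np.pi * 1.0 * np.arange(256) / fs)

reduced = downsample(window, factor=4)  # emulates 25 Hz acquisition

print(len(window), len(reduced))                 # 256 64
print(conv1d_macs(256) / conv1d_macs(len(reduced)))  # ~4.2x fewer MACs
```

Because the convolution's output length (and any downstream buffers) shrink with the input, RAM, latency, and energy fall by a similar factor, which is the roughly fourfold reduction the abstract reports.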