{"title":"Evaluate action primitives for human activity recognition using unsupervised learning approach","authors":"Luis F. Mejia-Ricart, Paul Helling, Aspen Olmsted","doi":"10.23919/ICITST.2017.8356374","DOIUrl":null,"url":null,"abstract":"Smartphones and wearable devices are in the frontlines when it comes to the field of Human Activity Recognition (HAR). There have been numerous attempts to use motion sensors in smartphones and wearables to recognize human activity. Most of these studies apply supervised learning techniques, which requires them to use labeled datasets. In this work, we take a sample of these labels, or action primitives (sit, stand, run, walk, jump, lie down), and evaluate them against the resulting labels of several clustering algorithms. We built two datasets (labeled and unlabeled) using accelerometer, gyroscope, and pedometer readings from two fixed-position devices, a smartphone in the side pocket, and a smartwatch strapped onto the left-hand wrist. Ultimately, we want to determine whether these action primitives commonly used in HAR are optimal, and suggest a better set of primitives if not.","PeriodicalId":440665,"journal":{"name":"2017 12th International Conference for Internet Technology and Secured Transactions (ICITST)","volume":"22 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"15","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 12th International Conference for Internet Technology and Secured Transactions (ICITST)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/ICITST.2017.8356374","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 15
Abstract
Smartphones and wearable devices are at the forefront of Human Activity Recognition (HAR). There have been numerous attempts to use the motion sensors in smartphones and wearables to recognize human activity. Most of these studies apply supervised learning techniques, which require labeled datasets. In this work, we take a sample of these labels, or action primitives (sit, stand, run, walk, jump, lie down), and evaluate them against the labels produced by several clustering algorithms. We built two datasets (labeled and unlabeled) using accelerometer, gyroscope, and pedometer readings from two fixed-position devices: a smartphone in the side pocket and a smartwatch strapped to the left wrist. Ultimately, we want to determine whether the action primitives commonly used in HAR are optimal and, if not, to suggest a better set of primitives.
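The core evaluation step described above, clustering unlabeled sensor windows and comparing the resulting assignments to the labeled action primitives, can be sketched as follows. This is a minimal illustration only: the windowed mean/std features, the choice of k-means with k equal to the number of primitives, the adjusted Rand index as the agreement measure, and the synthetic data are all assumptions, not the paper's actual pipeline or results.

```python
# Sketch: cluster windowed motion-sensor features without labels, then
# measure how well the clusters agree with the ground-truth primitives.
# Feature extraction and algorithm choices here are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score
from sklearn.preprocessing import StandardScaler

PRIMITIVES = ["sit", "stand", "run", "walk", "jump", "lie down"]

def window_features(samples, window=128):
    """Mean and standard deviation per axis over fixed-length windows.

    samples: (n, 6) array of accelerometer (x, y, z) and gyroscope
    (x, y, z) readings; a real pipeline would likely add pedometer
    counts and frequency-domain features.
    """
    n_windows = len(samples) // window
    trimmed = samples[: n_windows * window].reshape(n_windows, window, -1)
    return np.hstack([trimmed.mean(axis=1), trimmed.std(axis=1)])

# Synthetic stand-in for the labeled smartphone/smartwatch dataset.
rng = np.random.default_rng(0)
raw = rng.normal(size=(6 * 1280, 6))
X = StandardScaler().fit_transform(window_features(raw))
y_true = rng.integers(0, len(PRIMITIVES), size=len(X))  # placeholder labels

# Cluster with k equal to the number of candidate primitives, then score
# the agreement between cluster assignments and the labeled primitives.
clusters = KMeans(
    n_clusters=len(PRIMITIVES), n_init=10, random_state=0
).fit_predict(X)
print("Adjusted Rand index:", adjusted_rand_score(y_true, clusters))
```

Under this framing, agreement near 1 would suggest the chosen primitives align with natural clusters in the motion data, while persistently low agreement, or clusters that merge or split particular primitives, would motivate proposing a revised primitive set.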