Joseph Korpela, Kazuyuki Takase, T. Hirashima, T. Maekawa, Julien Eberle, D. Chakraborty, K. Aberer
{"title":"一种能量感知方法,用于使用可穿戴传感器联合识别活动和手势","authors":"Joseph Korpela, Kazuyuki Takase, T. Hirashima, T. Maekawa, Julien Eberle, D. Chakraborty, K. Aberer","doi":"10.1145/2802083.2808400","DOIUrl":null,"url":null,"abstract":"This paper presents an energy-aware method for recognizing time series acceleration data containing both activities and gestures using a wearable device coupled with a smartphone. In our method, we use a small wearable device to collect accelerometer data from a user's wrist, recognizing each data segment using a minimal feature set chosen automatically for that segment. For each collected data segment, if our model finds that recognizing the segment requires high-cost features that the wearable device cannot extract, such as dynamic time warping for gesture recognition, then the segment is transmitted to the smartphone where the high-cost features are extracted and recognition is performed. Otherwise, only the minimum required set of low-cost features are extracted from the segment on the wearable device and only the recognition result, i.e., label, is transmitted to the smartphone in place of the raw data, reducing transmission costs. Our method automatically constructs this adaptive processing pipeline solely from training data.","PeriodicalId":372395,"journal":{"name":"Proceedings of the 2015 ACM International Symposium on Wearable Computers","volume":"4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-09-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"28","resultStr":"{\"title\":\"An energy-aware method for the joint recognition of activities and gestures using wearable sensors\",\"authors\":\"Joseph Korpela, Kazuyuki Takase, T. Hirashima, T. Maekawa, Julien Eberle, D. Chakraborty, K. 
Aberer\",\"doi\":\"10.1145/2802083.2808400\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents an energy-aware method for recognizing time series acceleration data containing both activities and gestures using a wearable device coupled with a smartphone. In our method, we use a small wearable device to collect accelerometer data from a user's wrist, recognizing each data segment using a minimal feature set chosen automatically for that segment. For each collected data segment, if our model finds that recognizing the segment requires high-cost features that the wearable device cannot extract, such as dynamic time warping for gesture recognition, then the segment is transmitted to the smartphone where the high-cost features are extracted and recognition is performed. Otherwise, only the minimum required set of low-cost features are extracted from the segment on the wearable device and only the recognition result, i.e., label, is transmitted to the smartphone in place of the raw data, reducing transmission costs. 
Our method automatically constructs this adaptive processing pipeline solely from training data.\",\"PeriodicalId\":372395,\"journal\":{\"name\":\"Proceedings of the 2015 ACM International Symposium on Wearable Computers\",\"volume\":\"4 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-09-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"28\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2015 ACM International Symposium on Wearable Computers\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2802083.2808400\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2015 ACM International Symposium on Wearable Computers","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2802083.2808400","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
An energy-aware method for the joint recognition of activities and gestures using wearable sensors
This paper presents an energy-aware method for recognizing time series acceleration data containing both activities and gestures using a wearable device coupled with a smartphone. In our method, we use a small wearable device to collect accelerometer data from a user's wrist, recognizing each data segment using a minimal feature set chosen automatically for that segment. For each collected data segment, if our model finds that recognizing the segment requires high-cost features that the wearable device cannot extract, such as dynamic time warping for gesture recognition, then the segment is transmitted to the smartphone where the high-cost features are extracted and recognition is performed. Otherwise, only the minimum required set of low-cost features are extracted from the segment on the wearable device and only the recognition result, i.e., label, is transmitted to the smartphone in place of the raw data, reducing transmission costs. Our method automatically constructs this adaptive processing pipeline solely from training data.
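The dispatch logic described above can be sketched as follows. This is an illustrative outline only, not the paper's implementation: the feature set, the classifier, and the `needs_high_cost` test are placeholder assumptions standing in for the model that the paper learns automatically from training data.

```python
import numpy as np

def low_cost_features(segment):
    # Cheap per-segment statistics a wearable MCU could compute
    # (illustrative choices, not the paper's exact feature set).
    return np.array([segment.mean(), segment.std(),
                     segment.min(), segment.max()])

def recognize_on_wearable(segment, classify_low_cost, needs_high_cost):
    """Dispatch one acceleration segment: label it locally using
    low-cost features, or flag it for transmission to the smartphone
    when high-cost features (e.g. DTW for gestures) are required."""
    if needs_high_cost(segment):
        # Transmit the raw segment; the smartphone extracts the
        # high-cost features and performs recognition there.
        return ("transmit_raw", segment)
    feats = low_cost_features(segment)
    label = classify_low_cost(feats)
    # Transmit only the label in place of the raw data,
    # reducing transmission cost.
    return ("transmit_label", label)
```

In the paper this routing decision and the minimal per-segment feature set are constructed automatically from training data; the hypothetical `needs_high_cost` predicate here merely marks where that learned decision would plug in.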