Title: Recognizing activities from context and arm pose using finite state machines
Authors: Thiago Teixeira, Deokwoo Jung, G. Dublon, A. Savvides
Venue: 2009 Third ACM/IEEE International Conference on Distributed Smart Cameras (ICDSC)
Publication date: 2009-10-20
DOI: 10.1109/ICDSC.2009.5289370
Citations: 21
Abstract
We present an activity-recognition system for assisted living applications and smart homes. While existing systems tend to rely on expensive computation over comparatively high-dimensional data sets, ours leverages a small number of fundamentally different sensor measurements that provide context information pertaining to the person's location, and action information obtained by observing the motion of the body and arms. Camera nodes are placed on the ceiling to track people in the environment and place them in the context of a building map where areas and objects of interest are premarked. Additionally, a single inertial sensor node is placed on the subject's arm to infer arm pose, heading, and motion frequency using an accelerometer, gyroscope, and magnetometer. These four measurements are parsed using a lightweight hierarchy of finite state machines, yielding recognition rates with high precision and recall values (0.92 and 0.93, respectively).
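The core idea of parsing discrete sensor symbols with a finite state machine can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: all state names, event names, and the "drinking" activity below are hypothetical, chosen to show how a lightweight FSM could turn a stream of context events (from the ceiling cameras) and arm-pose events (from the inertial node) into a recognized activity label.

```python
class FSM:
    """A minimal finite state machine over discrete sensor symbols.

    transitions maps (state, event) -> next_state; unknown events
    leave the state unchanged. accepting maps a state to an activity
    label, emitted whenever that state is reached.
    """

    def __init__(self, start, transitions, accepting):
        self.state = start
        self.transitions = transitions
        self.accepting = accepting

    def step(self, event):
        # Stay in the current state if no transition matches the event.
        self.state = self.transitions.get((self.state, event), self.state)
        return self.accepting.get(self.state)


# Hypothetical "drinking" recognizer: the person is near a premarked
# table, raises an arm, then lowers it again.
drinking = FSM(
    start="idle",
    transitions={
        ("idle", "at_table"): "at_table",       # camera context event
        ("at_table", "arm_raised"): "arm_up",   # inertial arm-pose event
        ("arm_up", "arm_lowered"): "done",
        ("at_table", "left_table"): "idle",
        ("arm_up", "left_table"): "idle",
    },
    accepting={"done": "drinking"},
)

events = ["at_table", "arm_raised", "arm_lowered"]
result = [drinking.step(e) for e in events][-1]
print(result)  # -> drinking
```

In the paper's architecture such machines would sit in a hierarchy, with lower layers converting raw location, pose, heading, and motion-frequency measurements into the discrete events consumed here; that event-extraction layer is omitted from this sketch.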