Title: Multi-scale Conditional Random Fields for first-person activity recognition
Authors: Kai Zhan, S. Faux, F. Ramos
Published in: 2014 IEEE International Conference on Pervasive Computing and Communications (PerCom), 2014-03-24
DOI: 10.1109/PerCom.2014.6813944 (https://doi.org/10.1109/PerCom.2014.6813944)
Citations: 91
Abstract
We propose a novel pervasive system to recognise human daily activities from a wearable device. The system is designed in the form of reading glasses, named `Smart Glasses', integrating a 3-axis accelerometer and a first-person view camera. Our aim is to classify the user's activities of daily living (ADLs) based on both vision and head motion data. This ego-activity recognition system not only allows caretakers to track a specific person (such as a patient or an elderly person), but also has the potential to remind or warn people with cognitive impairments of hazardous situations. We present the following contributions in this paper: a feature extraction method for accelerometer and video data; a classification algorithm integrating both locomotive activities (body motions) and stationary activities (with little or no motion); and a novel multi-scale dynamic graphical model structure for structured classification over time. We collect, train and validate our system on a large dataset containing 20 hours of ADLs data, covering 12 daily activities under different environmental settings. Our method improves the classification performance (F-score) of conventional approaches from 43.32% (video features) and 66.02% (acceleration features) to 84.45%, an average improvement of 20-40%, with an overall accuracy of 90.04% on realistic ADLs.
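As a rough illustration of the accelerometer side of the feature-extraction step, the sketch below computes simple windowed statistics over 3-axis samples. The specific features used in the paper are not listed in the abstract; per-axis mean, standard deviation, and signal magnitude area are shown here only as common, hypothetical choices for distinguishing locomotive from stationary activities.

```python
import math

def accel_features(window):
    """Compute simple statistics over a window of 3-axis accelerometer
    samples, given as a list of (x, y, z) tuples.

    NOTE: this is an illustrative feature set, not the one from the
    paper; mean, standard deviation, and signal magnitude area (SMA)
    are common baselines for activity recognition.
    """
    n = len(window)
    feats = []
    for axis in range(3):
        vals = [s[axis] for s in window]
        mean = sum(vals) / n
        var = sum((v - mean) ** 2 for v in vals) / n
        feats.extend([mean, math.sqrt(var)])  # per-axis mean and std
    # signal magnitude area: mean absolute acceleration across all axes
    sma = sum(abs(v) for s in window for v in s) / n
    feats.append(sma)
    return feats

# usage: a short synthetic window (device at rest, gravity on the z-axis)
window = [(0.0, 0.0, 9.8), (0.1, -0.1, 9.7), (-0.1, 0.1, 9.9)]
features = accel_features(window)
print(features)  # 7 values: (mean, std) per axis, then SMA
```

In a full pipeline, such per-window feature vectors (together with the video features) would form the observations of the temporal graphical model that performs the structured classification.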