{"title":"Classifying social actions with a single accelerometer","authors":"H. Hung, G. Englebienne, J. Kools","doi":"10.1145/2493432.2493513","DOIUrl":null,"url":null,"abstract":"In this paper, we estimate different types of social actions from a single body-worn accelerometer in a crowded social setting. Accelerometers have many advantages in such settings: they are impervious to environmental noise, unobtrusive, cheap, low-powered, and their readings are specific to a single person. Our experiments show that they are surprisingly informative of different types of social actions. The social actions we address in this paper are whether a person is speaking, laughing, gesturing, drinking, or stepping. To our knowledge, this is the first work to carry out experiments on estimating social actions from conversational behavior using only a wearable accelerometer. The ability to estimate such actions using just the acceleration opens up the potential for analyzing more about social aspects of people's interactions without explicitly recording what they are saying.","PeriodicalId":262104,"journal":{"name":"Proceedings of the 2013 ACM international joint conference on Pervasive and ubiquitous computing","volume":"111 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"55","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2013 ACM international joint conference on Pervasive and ubiquitous computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2493432.2493513","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 55
Abstract
In this paper, we estimate different types of social actions from a single body-worn accelerometer in a crowded social setting. Accelerometers have many advantages in such settings: they are impervious to environmental noise, unobtrusive, cheap, low-powered, and their readings are specific to a single person. Our experiments show that they are surprisingly informative of different types of social actions. The social actions we address in this paper are whether a person is speaking, laughing, gesturing, drinking, or stepping. To our knowledge, this is the first work to carry out experiments on estimating social actions from conversational behavior using only a wearable accelerometer. The ability to estimate such actions using just the acceleration opens up the potential for analyzing more about social aspects of people's interactions without explicitly recording what they are saying.
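To make the idea of estimating social actions from a single body-worn accelerometer concrete, the sketch below shows one plausible per-window classification pipeline: slice the tri-axial signal into overlapping windows, compute simple statistical features, and train a standard classifier over the five action labels. Everything here is an illustrative assumption — the sampling rate, window length, feature set, `window_features` helper, and the random-forest classifier are not taken from the paper, and the data is a random placeholder.

```python
# Minimal, hypothetical sketch: per-window classification of social actions
# (speaking, laughing, gesturing, drinking, stepping) from one tri-axial
# accelerometer. Windowing, features, and classifier are illustrative
# assumptions, not the method used in the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

ACTIONS = ["speaking", "laughing", "gesturing", "drinking", "stepping"]

def window_features(signal, fs=20, win_s=3.0, hop_s=1.5):
    """Slice a (n_samples, 3) accelerometer stream into overlapping windows
    and compute per-axis statistics plus magnitude-based energy cues."""
    win, hop = int(win_s * fs), int(hop_s * fs)
    feats = []
    for start in range(0, len(signal) - win + 1, hop):
        w = signal[start:start + win]
        mag = np.linalg.norm(w, axis=1)                # acceleration magnitude
        feats.append(np.concatenate([
            w.mean(axis=0), w.std(axis=0),             # per-axis mean / std
            [mag.mean(), mag.std(), np.abs(np.diff(mag)).mean()],  # energy-like cues
        ]))
    return np.asarray(feats)

# Placeholder usage: random data standing in for real recordings, with one
# (hypothetical) action label per extracted window.
rng = np.random.default_rng(0)
train_raw = rng.normal(size=(2000, 3))                 # fake accelerometer stream
X_train = window_features(train_raw)
y_train = rng.choice(ACTIONS, size=len(X_train))       # fake window labels

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
predictions = clf.predict(window_features(rng.normal(size=(400, 3))))
```

In practice such a pipeline would be trained on labeled windows from the body-worn sensor and evaluated per action class; the choice of window length and features would need tuning for conversational behaviors like speaking and laughing, which show up as much subtler accelerations than stepping.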