Efficient in-pocket detection with mobile phones
Jun Yang, Emmanuel Munguia Tapia, S. Gibbs
DOI: 10.1145/2494091.2494099
Proceedings of the 2013 ACM conference on Pervasive and ubiquitous computing adjunct publication, 2013-09-08
Citations: 27

Abstract: In this demonstration paper, we show a novel approach to detecting the common placements of a mobile phone, such as "in pocket", "in bag", or "out of pocket or bag", using the embedded proximity (IR) and light sensors. We apply sensor data fusion and pattern recognition to extract distinctive features from the sensor signals and classify among these three phone placement contexts. The detection results are demonstrated on a Samsung Tizen mobile phone.
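The fusion idea behind the abstract can be illustrated with a toy rule-based classifier over the two sensor readings. This is a minimal sketch only: the paper does not publish its features or classifier, so the function name, the lux threshold, and the assumption that bag interiors read darker than pockets are all hypothetical, not the authors' method.

```python
def classify_placement(proximity_near: bool, light_lux: float) -> str:
    """Toy classifier fusing proximity (IR) and ambient-light readings.

    Hypothetical rules for illustration; the paper instead extracts
    features from the sensor signals and uses pattern recognition.
    """
    if not proximity_near:
        # Nothing covering the sensor face: phone is exposed.
        return "out of pocket or bag"
    # Sensor face is covered; use ambient light to separate the two cases.
    if light_lux < 1.0:  # assumed: a closed bag is nearly dark
        return "in bag"
    return "in pocket"   # assumed: pocket fabric lets some light through
```

In practice a real detector would classify over feature windows (e.g., light-level statistics and proximity transition patterns) rather than single instantaneous readings.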