ACM HotMobile 2013 demo: bringing in-situ social awareness to mobile systems: everyday interaction monitoring and its applications

Chulhong Min, Chanyou Hwang, Taiwoo Park, Yuhwan Kim, Uichin Lee, Inseok Hwang, Chungkuk Yoo, Changhoon Lee, Younghyun Ju, Junehwa Song, Jaeung Lee, Miri Moon, Haechan Lee, Youngki Lee

Mobile Computing and Communications Review, pages 9-10, published 2013-11-07. DOI: 10.1145/2542095.2542101
Do our smartphones help at the social gatherings of everyday life, for instance, dinner with family or meetings with friends? In recent years, smartphones have rapidly penetrated our everyday lives. Yet it is still early days for smartphone applications and systems that are closely woven into everyday social activities. We share so many moments and activities with other people right here, right in front of us, and so will smartphones [4]. We argue that multiple co-present smartphones can serve as a newly emerging substrate for a whole new class of in-situ social applications. Such applications have enormous opportunities in every facet of our daily lives, e.g., providing new user experiences or facilitating social interaction during shared social activities. They could also take advantage of the larger, more capable union of computing devices and resources. In this demo, we introduce a novel initiative toward an everyday face-to-face interaction monitoring system. Among the diverse verbal, aural, and visual cues expressed during face-to-face interaction, we first focus on capturing diverse meta-linguistic information from
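One kind of meta-linguistic cue such a monitoring system might extract is turn-taking: who is speaking, for how long, and how often the floor changes. As a minimal sketch (not the authors' implementation), a per-frame audio-energy trace can be run through a simple threshold-based voice activity detector to count speech segments and speaking frames; the frame granularity, the threshold value, and the synthetic energy trace below are all assumptions made for illustration.

```python
# Hypothetical illustration of one meta-linguistic cue: speaking turns
# estimated from a per-frame audio-energy trace via a simple
# threshold-based voice activity detector. The threshold and the
# synthetic trace are assumptions, not values from the demo paper.

def speaking_turns(energies, threshold=0.5):
    """Return (num_turns, frames_speaking) for a per-frame energy trace.

    A "turn" is a maximal run of consecutive frames whose energy
    meets or exceeds the threshold.
    """
    turns = 0
    speaking_frames = 0
    previously_speaking = False
    for energy in energies:
        speaking = energy >= threshold
        if speaking:
            speaking_frames += 1
            if not previously_speaking:
                turns += 1  # a new speech segment begins here
        previously_speaking = speaking
    return turns, speaking_frames

# Synthetic trace: two speech bursts separated by silence.
trace = [0.1, 0.8, 0.9, 0.2, 0.1, 0.7, 0.6, 0.1]
print(speaking_turns(trace))  # → (2, 4)
```

In a real deployment this per-frame detector would only be the first stage; co-present phones would still need to attribute each segment to a speaker and exchange summaries to build a shared picture of the conversation.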