Mina Sakamura, Takuro Yonezawa, J. Nakazawa, K. Takashio, H. Tokuda
Proceedings of the 2014 International Workshop on Web Intelligence and Smart Sensing, September 2014. DOI: 10.1145/2637064.2637095
Help Me!: Valuing and Visualizing Participatory Sensing Tasks with Physical Sensors
Recent progress in mobile devices such as smartphones enables humans to contribute their perceptual abilities as part of a sensing framework. This framework, called participatory sensing, distributes various sensing tasks (e.g., weather reports, waiting times in queues, traffic conditions) to potential participants, who can then select and carry out the tasks. However, as participatory sensing grows rapidly and the number of sensing tasks increases, it will become very hard for users to choose appropriate sensing tasks around them. To address this problem, we propose a system called Help Me!, which values and visualizes the importance of sensing tasks by quantifying them in cooperation with physical sensors. Because Help Me! provides an objective index for sensing tasks, it increases users' opportunities to participate in them. We designed and implemented Help Me! as an integrated architecture of physical sensors and participatory sensors. Through an initial in-lab experiment, we confirmed that Help Me! can increase users' opportunities to participate in sensing tasks.
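The abstract does not specify how task importance is quantified from physical sensors. As a purely illustrative sketch (not the paper's actual method), one could score a task by how far nearby sensor readings deviate from their baselines, boosted by the staleness of those readings, since older data makes human confirmation more valuable. All names and weights below are hypothetical assumptions:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    value: float      # current measurement from a physical sensor
    baseline: float   # expected "normal" value for this sensor
    timestamp: float  # seconds since epoch when the reading was taken

def task_importance(readings, now, staleness_weight=0.5, max_age=3600.0):
    """Hypothetical importance score for a participatory sensing task.

    Averages, over nearby physical sensors, the relative deviation of
    each reading from its baseline plus a staleness term (capped at
    max_age seconds). With no sensor coverage at all, human input is
    assumed maximally valuable and the score defaults to 1.0.
    """
    if not readings:
        return 1.0
    score = 0.0
    for r in readings:
        deviation = abs(r.value - r.baseline) / max(abs(r.baseline), 1e-9)
        age = min(max(now - r.timestamp, 0.0), max_age) / max_age
        score += deviation + staleness_weight * age
    return score / len(readings)

# Example: one temperature sensor reading 30 against a baseline of 25,
# taken 30 minutes ago -> deviation 0.2, staleness term 0.25, score 0.45.
print(task_importance([SensorReading(30.0, 25.0, 0.0)], now=1800.0))
```

Such a per-task score could then drive the visualization layer, e.g. rendering higher-scoring tasks more prominently on a map so users can pick the most valuable ones.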