Fausto Giunchiglia, M. Zeni, Enrico Bignotti, Wanyi Zhang
{"title":"评估野外标注一致性","authors":"Fausto Giunchiglia, M. Zeni, Enrico Bignotti, Wanyi Zhang","doi":"10.1109/PERCOMW.2018.8480236","DOIUrl":null,"url":null,"abstract":"The process of human annotation of sensor data is at the base of research areas such as participatory sensing and mobile crowdsensing. While much research has been devoted to assessing the quality of sensor data, the same cannot be said about annotations, which are fundamental to obtain a clear understanding of users experience. We present an evaluation of an interdisciplinary annotation methodology allowing users to continuously annotate their everyday life. The evaluation is done on a dataset from a project focused on the behaviour of students and how this impacts on their academic performance. We focus on those annotations concerning locations and movements of students, and we evaluate the annotations quality by checking their consistency. Results show that students are highly consistent with respect to the random baseline, and that these results can be improved by exploiting the semantics of annotations.","PeriodicalId":190096,"journal":{"name":"2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)","volume":"40 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Assessing Annotation Consistency in the Wild\",\"authors\":\"Fausto Giunchiglia, M. Zeni, Enrico Bignotti, Wanyi Zhang\",\"doi\":\"10.1109/PERCOMW.2018.8480236\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The process of human annotation of sensor data is at the base of research areas such as participatory sensing and mobile crowdsensing. While much research has been devoted to assessing the quality of sensor data, the same cannot be said about annotations, which are fundamental to obtain a clear understanding of users experience. 
We present an evaluation of an interdisciplinary annotation methodology allowing users to continuously annotate their everyday life. The evaluation is done on a dataset from a project focused on the behaviour of students and how this impacts on their academic performance. We focus on those annotations concerning locations and movements of students, and we evaluate the annotations quality by checking their consistency. Results show that students are highly consistent with respect to the random baseline, and that these results can be improved by exploiting the semantics of annotations.\",\"PeriodicalId\":190096,\"journal\":{\"name\":\"2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)\",\"volume\":\"40 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-03-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/PERCOMW.2018.8480236\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PERCOMW.2018.8480236","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Human annotation of sensor data underpins research areas such as participatory sensing and mobile crowdsensing. While much research has been devoted to assessing the quality of sensor data, the same cannot be said of annotations, which are fundamental to obtaining a clear understanding of users' experience. We present an evaluation of an interdisciplinary annotation methodology that allows users to continuously annotate their everyday life. The evaluation is carried out on a dataset from a project studying the behaviour of students and how it affects their academic performance. We focus on annotations concerning the locations and movements of students, and we evaluate annotation quality by checking their consistency. Results show that students are highly consistent relative to a random baseline, and that these results can be further improved by exploiting the semantics of the annotations.
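The abstract's core idea, measuring how consistent location/movement annotations are against a random baseline, can be illustrated with a small sketch. Note that the compatibility table, the annotation labels, and the shuffling-based baseline below are all hypothetical stand-ins; the paper's actual ontology and evaluation procedure are not specified here.

```python
import random

# Hypothetical semantic-compatibility table: which movement annotations
# are consistent with which location annotations. The paper's real
# annotation ontology is richer; this is only an illustrative stand-in.
COMPATIBLE = {
    "home": {"still", "walking"},
    "classroom": {"still"},
    "street": {"walking", "biking", "bus"},
}

def consistency(pairs):
    """Fraction of (location, movement) annotation pairs that are
    semantically compatible under the table above."""
    ok = sum(1 for loc, mov in pairs if mov in COMPATIBLE.get(loc, set()))
    return ok / len(pairs)

def random_baseline(pairs, trials=1000, seed=0):
    """Expected consistency when movement labels are shuffled
    independently of locations -- a simple chance-level baseline."""
    rng = random.Random(seed)
    locs = [p[0] for p in pairs]
    movs = [p[1] for p in pairs]
    total = 0.0
    for _ in range(trials):
        rng.shuffle(movs)
        total += consistency(list(zip(locs, movs)))
    return total / trials

# Toy annotation stream: four consistent pairs and one inconsistent one.
annotations = [
    ("home", "still"), ("street", "walking"),
    ("classroom", "still"), ("street", "bus"),
    ("home", "biking"),  # semantically inconsistent pair
]
print(consistency(annotations))      # 0.8 -- observed consistency
print(random_baseline(annotations))  # chance-level consistency
```

A higher observed consistency than the shuffled baseline is what "highly consistent with respect to the random baseline" would mean under this reading; enriching the compatibility table (i.e., exploiting annotation semantics) would correspondingly sharpen the check.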