{"title":"On strategies for budget-based online annotation in human activity recognition","authors":"Tudor Miu, P. Missier, D. Roggen, T. Plötz","doi":"10.1145/2638728.2641300","DOIUrl":null,"url":null,"abstract":"Bootstrapping activity recognition systems in ubiquitous and mobile computing scenarios often comes with the challenge of obtaining reliable ground truth annotations. A promising approach to overcome these difficulties involves obtaining online activity annotations directly from users. However, such direct engagement has its limitations as users typically show only limited tolerance for unwanted interruptions such as prompts for annotations. In this paper we explore the effectiveness of approaches to online, user-based annotation of activity data. Our central assumption is the existence of a fixed, limited budget of annotations a user is willing to provide. We evaluate different strategies on how to spend such a budget most effectively. Using the Opportunity benchmark we simulate online annotation scenarios for a variety of budget configurations and we show that effective online annotation can still be achieved using reduced annotation effort.","PeriodicalId":20496,"journal":{"name":"Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication","volume":"18 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2014-09-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2638728.2641300","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 7
Abstract
Bootstrapping activity recognition systems in ubiquitous and mobile computing scenarios often comes with the challenge of obtaining reliable ground truth annotations. A promising approach to overcoming these difficulties involves obtaining online activity annotations directly from users. However, such direct engagement has its limitations, as users typically show only limited tolerance for unwanted interruptions such as prompts for annotations. In this paper we explore the effectiveness of approaches to online, user-based annotation of activity data. Our central assumption is the existence of a fixed, limited budget of annotations a user is willing to provide. We evaluate different strategies for spending such a budget most effectively. Using the Opportunity benchmark, we simulate online annotation scenarios for a variety of budget configurations, and we show that effective online annotation can still be achieved with reduced annotation effort.
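To make the central idea concrete, the following is a minimal, hypothetical sketch (not the paper's implementation) of what "spending a fixed annotation budget" can mean: given a stream of activity instances and a budget of prompts, different strategies choose different instances to ask the user about. The two strategies shown ("front" and "spread") and the toy activity stream are illustrative assumptions, not strategies taken from the paper.

```python
# Hypothetical sketch: spending a fixed annotation budget over an activity stream.
import random

def simulate(stream, budget, strategy):
    """Return the indices of instances the user would be prompted to annotate.

    stream   -- list of activity instances observed over time
    budget   -- maximum number of annotation prompts allowed
    strategy -- 'front' (spend the whole budget at the start) or
                'spread' (distribute prompts uniformly over the stream)
    """
    n = len(stream)
    if strategy == "front":
        return list(range(min(budget, n)))
    if strategy == "spread":
        step = max(1, n // budget)
        return list(range(0, n, step))[:budget]
    raise ValueError(f"unknown strategy: {strategy}")

# Toy stream: the activity changes halfway through.
stream = ["walk"] * 50 + ["sit"] * 50
front = simulate(stream, 10, "front")
spread = simulate(stream, 10, "spread")

# 'front' never sees the second activity; 'spread' samples both.
print(sorted({stream[i] for i in front}))   # ['walk']
print(sorted({stream[i] for i in spread}))  # ['sit', 'walk']
```

Even this toy example shows why the budget-spending strategy matters: with the same budget of 10 prompts, one strategy covers only one of the two activities while the other covers both, which is the kind of difference the paper's simulations on the Opportunity benchmark quantify.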