{"title":"A screening methodology for crowdsourcing video QoE evaluation","authors":"Louis Anegekuh, Lingfen Sun, E. Ifeachor","doi":"10.1109/GLOCOM.2014.7036964","DOIUrl":null,"url":null,"abstract":"Recently, crowdsourcing has emerged as a cheaper and quicker alternative to traditional laboratory based Quality of Experience (QoE) evaluation for video streaming services. Crowdsourcing is a process of recruiting anonymous members of the public to solve/perform different tasks without supervision. This approach of seeking solutions from the public has been enhanced by ubiquitous internet connectivity, low cost and the ease to recruit workers without geographical restrictions. Although crowdsourcing makes it possible to save cost and reach a large group of people to perform subjective video quality testing, challenges such as validity of results and the trustworthiness of workers still remain unresolved. In this paper, we attempt to address some of these challenges by developing a screening algorithm that is able to determine the validity and trustworthiness of crowdsourcing video QoE evaluation results. This algorithm is based on evaluator's extracted data (e.g. Network IP addresses, network devices and browser information, time spent on the crowdsourcing platform, date and grading patterns). To determine the performance of this algorithm, we carried out a separate controlled laboratory based subjective video quality testing. This test enables us to determine how screened crowdsourcing results correlate with lab based results. Preliminary results show that crowdsourcing results can be improved by up to 59% when the proposed screening algorithm is applied. Moreover, the results support our assertion that crowdsourcing can provide reliable video QoE measurements if proper screening is performed on the test results.","PeriodicalId":6492,"journal":{"name":"2014 IEEE Global Communications Conference","volume":"8 1","pages":"1152-1157"},"PeriodicalIF":0.0000,"publicationDate":"2014-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 IEEE Global Communications Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/GLOCOM.2014.7036964","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 10
Abstract
Recently, crowdsourcing has emerged as a cheaper and quicker alternative to traditional laboratory-based Quality of Experience (QoE) evaluation for video streaming services. Crowdsourcing is the process of recruiting anonymous members of the public to solve or perform different tasks without supervision. This approach of seeking solutions from the public has been boosted by ubiquitous internet connectivity, low cost and the ease of recruiting workers without geographical restrictions. Although crowdsourcing makes it possible to save cost and reach a large group of people for subjective video quality testing, challenges such as the validity of results and the trustworthiness of workers remain unresolved. In this paper, we attempt to address some of these challenges by developing a screening algorithm that can determine the validity and trustworthiness of crowdsourced video QoE evaluation results. This algorithm is based on data extracted from evaluators (e.g. network IP addresses, network device and browser information, time spent on the crowdsourcing platform, date, and grading patterns). To determine the performance of this algorithm, we carried out a separate, controlled laboratory-based subjective video quality test, which enables us to determine how the screened crowdsourcing results correlate with the lab-based results. Preliminary results show that crowdsourcing results can be improved by up to 59% when the proposed screening algorithm is applied. Moreover, the results support our assertion that crowdsourcing can provide reliable video QoE measurements if proper screening is performed on the test results.
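The evaluator data the abstract lists (IP addresses, device and browser information, time spent on the platform, grading patterns) lends itself to simple rule-based filtering of untrustworthy submissions. The sketch below is a minimal, hypothetical illustration of that idea only, not the paper's actual screening algorithm: the Session structure, thresholds, and rejection rules are all assumptions made for the example.

```python
# A minimal sketch (not the authors' algorithm) of screening crowdsourced video
# QoE ratings using the kinds of evaluator data the paper mentions: IP address,
# time spent on the platform, and grading patterns. All field names and
# thresholds are illustrative assumptions.

from dataclasses import dataclass
from statistics import pstdev
from typing import List


@dataclass
class Session:
    worker_id: str
    ip_address: str
    seconds_on_platform: float  # total time spent rating
    ratings: List[int]          # MOS-style scores (1-5), one per test video


def screen_sessions(sessions: List[Session],
                    min_seconds: float = 120.0,
                    min_rating_spread: float = 0.5) -> List[Session]:
    """Keep only sessions that pass three simple trust checks: a unique IP
    address, a plausible amount of time on the platform, and ratings that
    are not all (nearly) identical."""
    # Count sessions per IP; duplicates suggest one person submitting
    # multiple times under different worker identities.
    ip_counts = {}
    for s in sessions:
        ip_counts[s.ip_address] = ip_counts.get(s.ip_address, 0) + 1

    trusted = []
    for s in sessions:
        if ip_counts[s.ip_address] > 1:
            continue  # reject duplicate-IP submissions
        if s.seconds_on_platform < min_seconds:
            continue  # too fast to have actually watched the clips
        if pstdev(s.ratings) < min_rating_spread:
            continue  # flat grading pattern (e.g. all 5s) looks untrustworthy
        trusted.append(s)
    return trusted


if __name__ == "__main__":
    demo = [
        Session("w1", "10.0.0.1", 300, [4, 3, 5, 2, 4]),
        Session("w2", "10.0.0.2", 45,  [5, 5, 5, 5, 5]),  # too fast, flat scores
        Session("w3", "10.0.0.3", 280, [3, 3, 4, 2, 3]),  # shares an IP with w4
        Session("w4", "10.0.0.3", 260, [2, 4, 3, 3, 2]),
    ]
    kept = screen_sessions(demo)
    print([s.worker_id for s in kept])  # only 'w1' survives screening
```

In the paper, the surviving (screened) crowdsourced scores are then compared against scores from a separate controlled laboratory test; a filter along these lines would feed only the trusted sessions into that correlation analysis.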