When the Crowd Challenges the Lab: Lessons Learnt from Subjective Studies on Image Aesthetic Appeal
J. Redi, E. Siahaan, Pavel Korshunov, Julian Habigt, T. Hossfeld
Proceedings of the Fourth International Workshop on Crowdsourcing for Multimedia, 2015-10-30
DOI: 10.1145/2810188.2810194 (https://doi.org/10.1145/2810188.2810194)
Citations: 16
Abstract
Crowdsourcing gives researchers the opportunity to collect subjective data quickly, in the real world, and from a very diverse pool of users. In a long-term study on image aesthetic appeal, we challenged crowdsourced assessments against typical lab methodologies in order to identify and analyze the impact of the crowdsourcing environment on the reliability of subjective data. We identified and conducted three types of crowdsourcing experiments that enabled an in-depth analysis of the factors influencing the reliability and reproducibility of results in uncontrolled crowdsourcing environments. We provide a generalized summary of lessons learnt for future studies that attempt to port lab-based evaluation methodologies to crowdsourcing, so that they can avoid the typical pitfalls in the design and analysis of crowdsourcing experiments.
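The abstract centers on assessing the reliability of subjective scores collected in uncontrolled crowdsourcing environments. As an illustration of the kind of analysis involved, below is a minimal Python sketch of one common worker-screening technique: flagging participants whose ratings correlate poorly with the leave-one-out consensus mean opinion score (MOS). The ratings matrix, the 0.5 threshold, and the simulated data are assumptions made for this example; this is not the paper's actual procedure or dataset.

```python
import numpy as np

# Hypothetical ratings matrix: rows = crowd workers, columns = images.
# Values are 5-point ACR-style scores; one worker clicks at random.
rng = np.random.default_rng(0)
true_appeal = rng.uniform(1, 5, size=20)           # latent appeal per image
ratings = true_appeal + rng.normal(0, 0.7, size=(30, 20))
ratings = np.clip(np.round(ratings), 1, 5)
ratings[2] = rng.integers(1, 6, size=20)           # simulate an unreliable worker

def screen_workers(ratings, threshold=0.5):
    """Flag workers whose scores correlate poorly with the consensus.

    For each worker, compute the Pearson correlation between their
    ratings and the MOS of the remaining workers (leave-one-out, so a
    worker's own scores do not inflate their agreement). Workers below
    the threshold are treated as unreliable.
    """
    reliable = []
    for w in range(ratings.shape[0]):
        others = np.delete(ratings, w, axis=0)
        mos = others.mean(axis=0)
        r = np.corrcoef(ratings[w], mos)[0, 1]
        reliable.append(r >= threshold)
    return np.array(reliable)

mask = screen_workers(ratings)
print(f"kept {mask.sum()} of {len(mask)} workers")
clean_mos = ratings[mask].mean(axis=0)             # MOS from reliable workers only
```

The leave-one-out consensus is the key design choice here: comparing each worker against a mean that includes their own ratings would bias the correlation upward and let noisy workers pass the screen.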