CredibleWeb: a platform for web credibility evaluation
Zhicong Huang, Alexandra Olteanu, K. Aberer
CHI '13 Extended Abstracts on Human Factors in Computing Systems, 2013-04-27
DOI: 10.1145/2468356.2468694 (https://doi.org/10.1145/2468356.2468694)
Citations: 11
Abstract
Web content is the main source of information for many users. However, due to the open nature of today's web, anyone can produce and publish content, which is therefore not always reliable. Mechanisms to evaluate the credibility of web content are thus needed. In this paper, we describe CredibleWeb, a prototype crowdsourcing platform for web content evaluation with a twofold goal: (1) to build a socially enhanced, large-scale dataset of credibility-labeled web pages that enables the evaluation of different strategies for web credibility prediction, and (2) to investigate how various design elements help engage users in actively evaluating the credibility of web pages. We outline the challenges related to the design of a crowdsourcing platform for web credibility evaluation and describe our initial efforts.
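To illustrate the kind of pipeline such a dataset enables, the following is a minimal sketch of aggregating crowdsourced credibility ratings into one label per page. The rating scale, schema, and function names here are hypothetical illustrations, not CredibleWeb's actual data model or aggregation strategy.

```python
from collections import defaultdict
from statistics import mean

def aggregate_credibility(ratings):
    """Aggregate per-worker credibility ratings (e.g. on a 1-5 scale)
    into a single mean score per page.

    `ratings` is a list of (url, score) pairs -- a hypothetical schema
    for illustration only.
    """
    by_url = defaultdict(list)
    for url, score in ratings:
        by_url[url].append(score)
    # Simple mean aggregation; a real platform might weight workers
    # by reliability or use majority voting instead.
    return {url: mean(scores) for url, scores in by_url.items()}

# Example: three worker judgments over two pages.
labels = aggregate_credibility([
    ("http://example.org/a", 4),
    ("http://example.org/a", 5),
    ("http://example.org/b", 2),
])
```

The resulting page-level scores could then serve as training labels when comparing different web credibility prediction strategies, which is the first of the paper's two stated goals.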