{"title":"Inconsistent rating scales decrease social influence bias and enhance crowd wisdom","authors":"Pedro Aceves, Cassandra R. Chambers","doi":"10.1016/j.chb.2024.108497","DOIUrl":null,"url":null,"abstract":"<div><div>Online ratings for products and services pervade society. Research on the wisdom-of-the-crowd phenomenon suggests that the average of individuals' ratings should reflect true underlying quality. Online platforms that collect ratings, however, often do so while displaying the crowd's average rating, creating the potential for social influence. This is problematic because the wisdom-of-the-crowd effect relies on the aggregation of independent evaluative judgments. How can platforms limit social influence bias and enhance crowd wisdom? We argue that the structure of the rating scales used to record individual evaluations can alter the degree of social influence bias in online rating platforms. Through an analysis of rating websites with over 4.4 million ratings from over 60,000 evaluators and an online experiment (200 participants), we show that by varying the rating scales that record individual evaluations, platforms can decrease the presence of social influence, better maintaining the independent judgments that are necessary for unbiased crowd wisdom. Understanding how to limit social influence bias in online ratings stands to improve quality judgments across large swaths of economic and social life.</div></div>","PeriodicalId":48471,"journal":{"name":"Computers in Human Behavior","volume":"164 ","pages":"Article 108497"},"PeriodicalIF":9.0000,"publicationDate":"2024-11-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers in Human Behavior","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0747563224003650","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
Citations: 0
Abstract
Online ratings for products and services pervade society. Research on the wisdom-of-the-crowd phenomenon suggests that the average of individuals' ratings should reflect true underlying quality. Online platforms that collect ratings, however, often do so while displaying the crowd's average rating, creating the potential for social influence. This is problematic because the wisdom-of-the-crowd effect relies on the aggregation of independent evaluative judgments. How can platforms limit social influence bias and enhance crowd wisdom? We argue that the structure of the rating scales used to record individual evaluations can alter the degree of social influence bias in online rating platforms. Through an analysis of rating websites with over 4.4 million ratings from over 60,000 evaluators and an online experiment (200 participants), we show that by varying the rating scales that record individual evaluations, platforms can decrease the presence of social influence, better maintaining the independent judgments that are necessary for unbiased crowd wisdom. Understanding how to limit social influence bias in online ratings stands to improve quality judgments across large swaths of economic and social life.
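To make the mechanism concrete, the toy simulation below (not the paper's model or data; the influence weight, noise level, and seeded early ratings are all assumptions) contrasts fully independent raters with raters who blend their private judgment with the running average displayed to them. Averaging independent judgments recovers the underlying quality, whereas anchoring on the displayed average lets a few early ratings pull the final crowd mean away from it.

```python
import random

random.seed(0)

TRUE_QUALITY = 3.5          # hypothetical latent quality on a 1-5 star scale
N_RATERS = 2000
SEED_RATINGS = [5, 5, 5]    # a few unusually positive early ratings (assumed)
INFLUENCE = 0.8             # assumed weight raters give to the displayed average

def private_signal() -> float:
    """One rater's independent judgment: true quality plus noise, kept in [1, 5]."""
    return min(5.0, max(1.0, random.gauss(TRUE_QUALITY, 1.0)))

def crowd_mean(influence: float) -> float:
    """Sequential ratings with a displayed running average.

    Each rater records a blend of their private signal and the average shown
    so far; influence = 0.0 reproduces fully independent ratings.
    """
    ratings = list(SEED_RATINGS)
    for _ in range(N_RATERS):
        displayed = sum(ratings) / len(ratings)
        ratings.append((1 - influence) * private_signal() + influence * displayed)
    return sum(ratings) / len(ratings)

print(f"true quality:             {TRUE_QUALITY:.2f}")
print(f"independent crowd mean:   {crowd_mean(0.0):.2f}")        # early seeds wash out; mean tracks true quality
print(f"socially influenced mean: {crowd_mean(INFLUENCE):.2f}")  # early seeds keep the crowd average inflated
```

Setting the influence weight to zero reproduces the independent-ratings baseline, which is the condition the abstract argues platforms can better preserve by varying the rating scales used to record individual evaluations.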
Journal description
Computers in Human Behavior is a scholarly journal that explores the psychological aspects of computer use. It covers original theoretical works, research reports, literature reviews, and software and book reviews. The journal examines both the use of computers in psychology, psychiatry, and related fields, and the psychological impact of computer use on individuals, groups, and society. Articles discuss topics such as professional practice, training, research, human development, learning, cognition, personality, and social interactions. It focuses on human interactions with computers, considering the computer as a medium through which human behaviors are shaped and expressed. Professionals interested in the psychological aspects of computer use will find this journal valuable, even if they have limited knowledge of computers.