A Quantitative Study of Impact of Incentive to Quality of Software Reviews

Mingwei Tang, Zhiwei Xu, Yuhao Qin, Cui Su, Yi Zhu, Feifei Tao, Junhua Ding
DOI: 10.1109/DSA56465.2022.00016
Published in: 2022 9th International Conference on Dependable Systems and Their Applications (DSA), August 2022
Citations: 1

Abstract

Online reviews such as product reviews are important references for potential users to learn the basic information about a product. However, some reviewers are rewarded for writing reviews, which may compromise the objectivity of those reviews. On the other hand, reviews written by reviewers who were not rewarded could be of low quality. Prior research has shown that incentivized reviewers may give a higher overall score than non-incentivized (also called organic) reviewers do, but does that pattern also apply to review content? In this paper, a quantitative comparison study is conducted to investigate the differences between incentivized reviews and organic reviews of software products. Four comparisons, covering overall score, sentiment preference, correlation, and similarity, are performed using statistical and text-mining methods. The results show no statistically significant difference between incentivized and organic reviews except in the sentiment of the total review text and of its "Problems and Benefits" and "Summary" parts. The results are unexpected, since the software-review website from which the reviews were collected had already filtered out low-quality reviews. This demonstrates that incentivization does not necessarily produce biased reviews, and it may be an effective way to attract more reviews, since more than 75% of the website's reviews are incentivized. The paper also analyzes possible reasons based on reviewers' positions in a company, a review's indicators, and reviewers' common actions. Based on this analysis, the study suggests that potential users may pay attention to certain quality dimensions of a review to mitigate the risk of bias.
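The abstract does not specify which statistical test underlies the overall-score comparison; purely as an illustrative sketch, such a comparison could be run as a two-sample significance test on the score distributions of the two reviewer groups. The choice of Welch's t statistic and the score data below are assumptions for illustration, not details taken from the paper:

```python
import statistics
from math import sqrt

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances assumed)."""
    mean_a, mean_b = statistics.fmean(a), statistics.fmean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)  # sample variance
    return (mean_a - mean_b) / sqrt(var_a / len(a) + var_b / len(b))

# Hypothetical overall scores for the two reviewer groups (illustrative only)
incentivized = [9, 8, 9, 10, 8, 9, 9, 10]
organic      = [8, 9, 7, 9, 8, 8, 9, 8]

t = welch_t(incentivized, organic)
print(round(t, 3))
```

A real analysis would additionally compute a p-value (e.g. from the t distribution with Welch–Satterthwaite degrees of freedom) and only then conclude whether the difference is statistically significant, as the paper does.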