Guaranteed Incentives and Prize Drawings: Effects on Participation, Data Quality, and Costs in a Web Survey of College Students on Sensitive Topics
Jennifer Dykema, John Stevenson, Cameron P. Jones, Brendan F. Day
Social Science Computer Review, published 2023-07-21
DOI: 10.1177/08944393231189853
Citations: 0
Abstract
Many studies rely on traditional web survey methods in which all contacts with sample members occur through email and the questionnaire is administered exclusively online. Because it is difficult to effectively administer prepaid incentives via email, researchers frequently employ lotteries or prize draws as incentives, even though their influence on survey participation is small. The current study examines whether a prize draw is more effective when divided into a few larger amounts versus several smaller amounts, and compares prize draws to a small but guaranteed postpaid incentive. Data are from the 2019 Campus Climate Survey on Sexual Assault and Sexual Misconduct. Sample members include 38,434 undergraduate and graduate students at a large Midwestern university who were randomly assigned to receive: a guaranteed $5 Amazon gift card; entry into a high-payout drawing for one of four $500 prizes; or entry into a low-payout drawing for one of twenty $100 prizes. Results indicate the guaranteed incentive increased response rates, with no difference between the prize draws. While results from various data quality outcomes show the guaranteed incentive reduced break-off rates and the high-payout drawing increased item nonresponse, there were no differences across incentive conditions in rates of speeding, reporting of sensitive data, straightlining, or sample representativeness. As expected, the prize draws had much lower overall and per-complete costs.
Journal Description
Unique Scope: Social Science Computer Review is an interdisciplinary journal covering social science instructional and research applications of computing, as well as societal impacts of information technology. Topics include: artificial intelligence, business, computational social science theory, computer-assisted survey research, computer-based qualitative analysis, computer simulation, economic modeling, electronic publishing, geographic information systems, instrumentation and research tools, public administration, social impacts of computing and telecommunications, software evaluation, and World Wide Web resources for social scientists.
Interdisciplinary Nature: Because the uses and impacts of computing are interdisciplinary, so is Social Science Computer Review. The journal is of direct relevance to scholars and scientists in a wide variety of disciplines. In its pages you'll find work in the following areas: sociology, anthropology, political science, economics, psychology, computer literacy, computer applications, and methodology.