Survey Practice · Pub Date: 2019-02-25 · DOI: 10.29115/SP-2018-0034
S. Giroux, K. Tharp, Derek C. Wietelman
Impacts of Implementing an Automatic Advancement Feature in Mobile and Web Surveys

Abstract: As more surveys are administered online and taken on mobile devices, researchers need to continue working toward best practices. One relatively new technical feature is automatic advancement, which scrolls the screen to the next item on a survey page once a respondent selects an answer to the current question. This and related automatic advancement methods, such as the horizontal scrolling matrix, have so far produced mixed effects on data quality, breakoff rates, and other survey measures. We present findings from a 2017 experiment in a survey of law school students, in which we randomly assigned an automatic advancement feature to half of a sample of nearly 40,000 students in the United States and Canada. We find no significant differences in survey duration, straight-lining, breakoff rates, or item nonresponse (for mobile users) between the two experimental groups, but desktop users without the automatic advancement feature had higher item nonresponse. More notably, respondents receiving the automatic advancement treatment changed, on average, about 50% fewer answers across the survey instrument than those who did not, and we incorporate qualitative data to help explain why. Finally, respondents rated the automatic advancement feature as easier to use and as having a better visual design. This research extends the limited body of work on automatic advancement features, helps move the field closer to best practices for mobile survey design, and sheds light on a respondent behavior tied to data quality.
Survey Practice · Pub Date: 2019-01-01 · Epub Date: 2019-06-24 · DOI: 10.29115/SP-2019-0003
Megan E Patrick, Mick P Couper, Bohyun Joy Jang, Virginia Laetz, John E Schulenberg, Lloyd D Johnston, Jerald Bachman, Patrick M O'Malley
Two-Year Follow-up of a Sequential Mixed-Mode Experiment in the U.S. National Monitoring the Future Study
Survey Practice, 12(1). PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6924618/pdf/nihms-1060109.pdf

Abstract: This study examines the two-year follow-up (data collected in 2016 at modal age 21/22) of an original mixed-mode longitudinal survey experiment (data collected at modal age 19/20 in 2014). The study compares participant retention in the experimental conditions to retention in the standard Monitoring the Future (MTF) control condition (participants who completed an in-school baseline survey in 12th grade in 2012 or 2013 and were selected to participate in the first follow-up survey by mail in 2014, N=2,451). A supplementary sample who completed the 12th-grade baseline survey in 2012 or 2013 but were not selected for the main MTF follow-up (N=4,950) was recruited and randomly assigned, in 2014 and again in 2016, to one of three experimental conditions: (1) Mail Push, (2) Web Push, or (3) Web Push + Email. Results from the first experiment indicated that Condition 3 (Web Push + Email) was promising, given similar response rates and lower costs (Patrick et al. 2018). The current study examines how experimental condition and type of 2014 response were associated with response in 2016, the extent to which response mode and device type changed from 2014 to 2016, and cumulative cost comparisons across conditions. Responding via web in 2014 was associated with greater odds of participating again in 2016 regardless of condition; respondents tended to respond in the same mode, although the "push" condition did move respondents toward web over paper; device type varied between waves; and the cumulative cost savings of Web Push + Email grew larger relative to the MTF control. The web push strategy is therefore promising for maintaining respondent engagement while reducing cost.
Survey Practice · Pub Date: 2019-01-01 · Epub Date: 2019-07-01 · DOI: 10.29115/sp-2019-0004
Carol Pierannunzi, Ashley Hyon, Jeff Bareham, Machell Town
Sample and Respondent Provided County Comparisons Among Cellular Respondents Using Rate Center Assignments
Survey Practice, 12(1), 1-8. PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8040621/pdf/nihms-1680309.pdf

Abstract: The percentage of cell phones in telephone survey samples continues to grow along with the percentage of potential respondents who rely on cell phones for personal communication. One problem with cell phone samples is that persons who move, or who purchase cell phones far from their residence, may not be eligible for surveys with geographic parameters, which limits researchers' ability to sample and analyze specific geographic jurisdictions. Because cell phone numbers do not accurately indicate respondent locations, rate centers have been used in recent years to ascertain where respondents live. The Behavioral Risk Factor Surveillance System (BRFSS) is a state-based telephone survey administered to over 400,000 respondents annually; approximately half of the sample is drawn from cell phone numbers. This research examines the county-level accuracy of the 2016 BRFSS sample. Results indicate that cell phone samples are accurate at the state and county level 58% of the time and at the state level alone 93% of the time. However, accuracy rates vary by state, region, and metropolitan status, as well as by demographic characteristics and survey items. Specific examples of where county-level accuracy varies are provided.
Survey Practice · Pub Date: 2018-12-09 · DOI: 10.29115/SP-2018-0033
M. Berzofsky, Tasseli E. McKay, Y. Hsieh, A. C. Smith
Probability-Based Samples on Twitter: Methodology and Application

Abstract: Social media platforms such as Facebook and Twitter can be excellent resources for collecting data, especially when targeting a hard-to-reach or rare population. However, data collection through social media has generally relied on non-probability methods, which limits inference beyond the sample itself. This paper demonstrates a methodology for selecting a random, probability-based sample from Twitter. We apply our methods to a survey of youth (persons aged 14 to 21) with the goal of oversampling sexual and gender minorities. We offer recommendations for reproducing our methodology with other populations and suggest how to improve it to ensure response targets are achieved.
Survey Practice · Pub Date: 2018-11-12 · DOI: 10.29115/SP-2018-0031
Jenny Marlar, M. Chattopadhyay, Jeff Jones, Stephanie Marken, F. Kreuter
Within-Household Selection and Dual-Frame Telephone Surveys: A Comparative Experiment of Eleven Different Selection Methods

Abstract: Numerous within-household selection methods have been tested in general population surveys since the advent of telephone interviewing, but very few selection studies, if any, have been conducted with a dual-frame (landline and cell phone) sample. Landline and cell phone frames are known to represent demographically different groups of respondents, and selection methods that yield more representative demographics in a landline frame may actually skew the results when combined with the cell phone frame. This study tested 11 within-household selection methods with approximately 11,000 landline respondents; a parallel cell phone sample of 1,000 respondents was also collected, and the frames were combined for analysis. The methods tested comprised one probability-based method, four quasi-probability methods, and six nonprobability methods, evaluated on four criteria: response rates, accuracy, demographic representation, and substantive results. The demographic representativeness of each method was examined for the landline frame alone and for the combined dual frame. The probability method had the lowest response rate, while the nonprobability at-home methods had the highest. Accuracy rates were lowest for the quasi-probability birthday methods. There were few demographic differences between selection methods, and no substantive differences, once combined with the cell phone sample.
Survey Practice · Pub Date: 2018-10-28 · DOI: 10.29115/SP-2018-0029
Elizabeth M. Brown, Lindsay T Olson, M. Farrelly, J. Nonnemaker, H. Battles, J. Hampton
Comparing Response Rates, Costs, and Tobacco-Related Outcomes Across Phone, Mail, and Online Surveys

Abstract: Tobacco control evaluation relies on surveillance of attitudes and behaviors, and practitioners and evaluators seek the most efficient, high-quality surveillance methodology. This paper explores the feasibility of mail and online data collection protocols using address-based sampling (ABS) to complement landline and cell phone surveys as part of a comprehensive tobacco control program evaluation. We conducted a comparative study of response rates, costs, and key outcomes across phone, mail, and online survey protocols. In 2015, we supplemented the phone-administered NY Adult Tobacco Survey with an ABS frame for paper and online data collection. For each protocol, we calculated response rates, compared unweighted demographic characteristics, and compared weighted outcomes for smoking prevalence, quit attempts, and tobacco control policy support; we also assessed the relative cost per completed survey. Response rates were highest for paper surveys (38.9%), followed by online (28.6%), landline (22.2%), and cell phone (14.7%) surveys. Respondent demographics differed across protocols: landline, mail, and online respondents were more likely than cell phone respondents to be older, female, white, and more highly educated. Smoking prevalence varied by protocol, but quit attempts and tobacco control policy support were similar across protocols. Cost-per-complete estimates were lowest for paper surveys. Programs rely on efficient and representative methodologies, and paper and online surveys with ABS show promise for supplementing phone surveillance, improving response rates, and lowering costs per completed survey.
Survey Practice · Pub Date: 2018-10-22 · DOI: 10.29115/SP-2018-0027
T. Marshall
Taking Another Look at Counterarguments: When Do Survey Respondents Switch Their Answers?

Abstract: Counterarguments are an often-asked but little-studied question format. In this format, a respondent first offers an opinion on a question and, depending on that answer, is then asked one or more follow-up questions, each presenting information that might cause the respondent to switch the earlier answer. In past studies, a third to a half of respondents typically switched their original answer. This meta-analysis of all identifiable counterargument items in iPOLL's online archive finds that switching occurs at different rates across issues. Switching is also most likely when respondents favored the position in the prior question, when the questioning occurs later in the survey, and when other counterarguments were previously asked of the respondent. Mode and house effects also appear.
Survey Practice · Pub Date: 2018-10-15 · DOI: 10.29115/SP-2018-0032
S. Keeter
Ask the Expert: Polling the 2018 Elections

Abstract: As the 2018 elections approach, polling finds itself in the spotlight once again, and the glare is harsh. Facing the growing problems that confront all of survey research, as well as public skepticism about polling following the 2016 presidential election, polling practitioners have examined their methods, and many have made changes. For this edition of "Ask the Expert," Survey Practice approached a wide range of public pollsters for their views on the environment for election polling this year and on any changes they have made in how they conduct their polls.
Survey Practice · Pub Date: 2018-09-24 · DOI: 10.29115/SP-2018-0028
T. Jamieson, Güez Salinas
Protecting Human Subjects in the Digital Age: Issues and Best Practices of Data Protection

Abstract: Public opinion and survey researchers must protect the privacy and confidentiality of human subjects. However, scholars are often not trained in best practices for data storage, and there is a serious risk that survey data might be compromised by malicious actors. In an era when recruiting participants is increasingly difficult, breaches could further undermine our ability to conduct surveys if we cannot guarantee that participants' data will remain confidential and private. While any computer-based data has some vulnerability, we introduce simple measures that better protect the confidentiality and privacy of human subjects, and we hope these become standard practice for protecting human subjects in the future.
Survey Practice · Pub Date: 2018-09-11 · DOI: 10.29115/SP-2018-0026
N. Parker
Interview with an Expert Series – Usability Testing in Survey Research

Abstract: This installment of the Interview with an Expert Series focuses on usability testing. I had the pleasure of interviewing two experts in the field: Emily Geisen, Survey Methodologist and Usability/Cognitive Testing Manager at RTI International, and Jennifer Romano Bergstrom, Director of User Experience Research at Bridgewater Associates. Emily and Jennifer came to this field by different avenues, but both have developed a deep appreciation for the value of testing early and often when developing a survey instrument. In the interview, they discuss their career paths, the usability testing process, and much more.