{"title":"QUESTIONNAIRE COMPLEXITY, REST PERIOD, AND RESPONSE LIKELIHOOD IN ESTABLISHMENT SURVEYS","authors":"J. Rodhouse, T. Wilson, Heather E Ridolfo","doi":"10.1093/JSSAM/SMAB017","DOIUrl":"https://doi.org/10.1093/JSSAM/SMAB017","url":null,"abstract":"\u0000 Response burden has been a concern in survey research for some time. One area of concern is the negative impact that response burden can have on response rates. In an effort to mitigate negative impacts on response rates, survey research organizations try to minimize the burden respondents are exposed to and maximize the likelihood of response. Many organizations also try to be mindful of the role burden may play in respondents’ likelihood to participate in future surveys by implementing rest periods or survey holidays. Recently, new evidence from a study of cross-sectional household surveys provided an interesting lens to examine burden. The evidence demonstrated that those sampled in two independent surveys are more likely to respond to the second survey if the first survey was more difficult to complete, and that this effect was not significantly influenced by the rest period in between the two surveys. These findings are compelling, and since the mechanisms influencing response in household and establishment surveys differ in important ways, a similar examination in an establishment survey context is warranted. To accomplish this, data are used from the National Agricultural Statistics Service. Overall, our research finds that prior survey features such as questionnaire complexity (or burden), prior response disposition and rest period are significantly associated with response to subsequent surveys. We also find that sample units first receiving a more complex questionnaire have significantly higher probabilities of response to a subsequent survey than do those receiving a simpler questionnaire first. The findings in this paper have implications for nonresponse adjustments and identification of subgroups for adaptive design data collection.","PeriodicalId":17146,"journal":{"name":"Journal of Survey Statistics and Methodology","volume":" ","pages":""},"PeriodicalIF":2.1,"publicationDate":"2021-06-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"41839493","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A DYNAMIC SURVIVAL MODELING APPROACH TO THE PREDICTION OF WEB SURVEY BREAKOFF","authors":"Felicitas Mittereder, B. West","doi":"10.1093/JSSAM/SMAB015","DOIUrl":"https://doi.org/10.1093/JSSAM/SMAB015","url":null,"abstract":"\u0000 Respondents who break off from a web survey prior to completing it are a prevalent problem in data collection. To prevent breakoff bias, it is crucial to keep as many diverse respondents in a web survey as possible. As a first step of preventing breakoffs, this study aims to understand breakoff and the associated response behavior. We analyze data from an annual online survey using dynamic survival models and ROC analyses. We find that breakoff risks between respondents using mobile devices versus PCs do not differ at the beginning of the questionnaire, but the risk for mobile device users increases as the survey progresses. Very fast respondents as well as respondents with changing response times both have a higher risk of quitting the questionnaire, compared to respondents with slower and steady response times. We conclude with a discussion of the implications of these findings for future practice and research in web survey methodology.","PeriodicalId":17146,"journal":{"name":"Journal of Survey Statistics and Methodology","volume":" ","pages":""},"PeriodicalIF":2.1,"publicationDate":"2021-06-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46746139","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"PERCEIVED BURDEN, FOCUS OF ATTENTION, AND THE URGE TO JUSTIFY: THE IMPACT OF THE NUMBER OF SCREENS AND PROBE ORDER ON THE RESPONSE BEHAVIOR OF PROBING QUESTIONS","authors":"Katharina Meitinger, A. Toroslu, Klara Raiber, Michael Braun","doi":"10.1093/JSSAM/SMAA043","DOIUrl":"https://doi.org/10.1093/JSSAM/SMAA043","url":null,"abstract":"\u0000 Web probing is a valuable tool to assess the validity and comparability of survey items. It uses different probe types—such as category-selection probes and specific probes—to inquire about different aspects of an item. Previous web probing studies often asked one probe type per item, but research situations exist where it might be preferable to test potentially problematic items with multiple probes. However, the response behavior might be affected by two factors: question order and the visual presentation of probes on one screen versus multiple screens as well as their interaction. In this study, we report evidence from a web experiment that was conducted with 532 respondents from Germany in September 2013. Experimental groups varied by screen number (1 versus 2) and probe order (category-selection probe first versus specific probe first). We assessed the impact of these manipulations on several indicators of response quality, probe answer content, and the respondents’ motivation with logistic regressions and two-way ANOVAs. We reveal that multiple mechanisms push response behavior in this context: perceived response burden, the focus of attention, the need for justification, and verbal context effects. We find that response behavior in the condition with two screens and category-selection probe first outperforms all other experimental conditions. We recommend this implementation in all but one scenario: if the goal is to test an item that includes a key term with a potentially too large lexical scope, we recommend starting with a specific probe but on the same screen as the category-selection probe.","PeriodicalId":17146,"journal":{"name":"Journal of Survey Statistics and Methodology","volume":" ","pages":""},"PeriodicalIF":2.1,"publicationDate":"2021-06-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45424242","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Optimality of the Recursive Neyman Allocation","authors":"J. Wesołowski, R. Wieczorkowski, Wojciech W'ojciak","doi":"10.1093/jssam/smab018","DOIUrl":"https://doi.org/10.1093/jssam/smab018","url":null,"abstract":"\u0000 We derive a formula for the optimal sample allocation in a general stratified scheme under upper bounds on the sample strata sizes. Such a general scheme includes SRSWOR within strata as a special case. The solution is given in terms of V allocation with V being the set of take-all strata. We use V allocation to give a formal proof of optimality of the popular recursive Neyman algorithm, rNa. We also propose a quick proof of optimality of the algorithm of Stenger and Gabler, SGa, as well as of our proposed modification, coma. Finally, we compare running times of rNa, SGa, and coma. Ready-to-use R-implementations of these algorithms are available on CRAN repository at https://cran.r-project.org/web/packages/stratallo.","PeriodicalId":17146,"journal":{"name":"Journal of Survey Statistics and Methodology","volume":" ","pages":""},"PeriodicalIF":2.1,"publicationDate":"2021-05-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43743867","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Modeling public opinion over time: A simulation study of latent trend models","authors":"M. Kołczyńska, P. Bürkner","doi":"10.31235/osf.io/gauvx","DOIUrl":"https://doi.org/10.31235/osf.io/gauvx","url":null,"abstract":"Analyzing trends in public opinion is important for monitoring social change and for testing theories aimed at explaining this change. With growing availability of multi-wave surveys, social scientists are increasingly turning to latent trend models applied to survey data for examining changes in social and political attitudes. With the aim of facilitating this research, our study compares different approaches to modeling latent trends of aggregate public opinion: splines, Gaussian processes, and discrete autoregressive models. We examine the ability of these models to recover latent trends with simulated data that vary with regard to the frequency and magnitude of changes in the true trend, model complexity and data availability. Overall, we find that all three latent trend models perform well in all scenarios, even the most difficult ones with frequent and weak changes of the latent trend and sparse data. The two main performance differences we find include the relatively higher squared errors of autoregressive models compared to the other models, and the under-coverage of posterior intervals in high-frequency low-amplitude trends with splines. For all models and across all scenarios performance improves with increased data availability, which emphasizes the need of supplying sufficient data for accurate estimation of latent trends.","PeriodicalId":17146,"journal":{"name":"Journal of Survey Statistics and Methodology","volume":" ","pages":""},"PeriodicalIF":2.1,"publicationDate":"2021-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"49120772","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Risk of Nonresponse Bias and the Length of the Field Period in a Mixed-Mode General Population Panel","authors":"Bella Struminskaya, Tobias Gummer","doi":"10.1093/JSSAM/SMAB011","DOIUrl":"https://doi.org/10.1093/JSSAM/SMAB011","url":null,"abstract":"\u0000 Survey researchers are often confronted with the question of how long to set the length of the field period. Longer fielding time might lead to greater participation yet requires survey managers to devote more of their time to data collection efforts. With the aim of facilitating the decision about the length of the field period, we investigated whether a longer fielding time reduces the risk of nonresponse bias to judge whether field periods can be ended earlier without endangering the performance of the survey. By using data from six waves of a probability-based mixed-mode (online and mail) panel of the German population, we analyzed whether the risk of nonresponse bias decreases over the field period by investigating how day-by-day coefficients of variation develop during the field period. We then determined the optimal cut-off points for each mode after which data collection can be terminated without increasing the risk of nonresponse bias and found that the optimal cut-off points differ by mode. Our study complements prior research by shifting the perspective in the investigation of the risk of nonresponse bias to panel data as well as to mixed-mode surveys, in particular. Our proposed method of using coefficients of variation to assess whether the risk of nonresponse bias decreases significantly with each additional day of fieldwork can aid survey practitioners in finding the optimal field period for their mixed-mode surveys.","PeriodicalId":17146,"journal":{"name":"Journal of Survey Statistics and Methodology","volume":" ","pages":""},"PeriodicalIF":2.1,"publicationDate":"2021-04-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1093/JSSAM/SMAB011","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45924091","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"JSSAM Special Issue on Disability Measurement and Analysis: Preface","authors":"K. Wolter, C. Cappa, E. Erosheva, J. Madans, Kristen Miller, P. Scanlon, J. Weeks","doi":"10.1093/JSSAM/SMAB016","DOIUrl":"https://doi.org/10.1093/JSSAM/SMAB016","url":null,"abstract":"","PeriodicalId":17146,"journal":{"name":"Journal of Survey Statistics and Methodology","volume":"9 1","pages":"205-208"},"PeriodicalIF":2.1,"publicationDate":"2021-04-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"46977206","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Exploring the Feasibility of Recruiting Respondents and Collecting Web Data Via Smartphone: A Case Study of Text-To-Web Recruitment for a General Population Survey in Germany","authors":"Hannah Bucher, Matthias Sand","doi":"10.1093/JSSAM/SMAB006","DOIUrl":"https://doi.org/10.1093/JSSAM/SMAB006","url":null,"abstract":"\u0000 The widespread usage of smartphones, as well as their technical features, offers many opportunities for survey research. As a result, the importance and popularity of smartphone surveys is steadily increasing. To explore the feasibility of a new text-to-web approach for surveying people directly via their smartphones, we conducted a case study in Germany in which we recruited respondents from a mobile random digit dialing sample via text messages that included a link to a web survey. We show that, although this survey approach is feasible, it is hampered by a number of issues, namely a high loss of numbers at the invitation stage, and a high rate of implicit refusals on the landing page of the survey.","PeriodicalId":17146,"journal":{"name":"Journal of Survey Statistics and Methodology","volume":" ","pages":""},"PeriodicalIF":2.1,"publicationDate":"2021-03-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1093/JSSAM/SMAB006","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44539101","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"CORRIGENDUM TO: USING AMERICAN COMMUNITY SURVEY DATA TO IMPROVE ESTIMATES FROM SMALLER U.S. SURVEYS THROUGH BIVARIATE SMALL AREA ESTIMATION MODELS","authors":"Carolina Franco, W. Bell","doi":"10.1093/JSSAM/SMAB010","DOIUrl":"https://doi.org/10.1093/JSSAM/SMAB010","url":null,"abstract":"","PeriodicalId":17146,"journal":{"name":"Journal of Survey Statistics and Methodology","volume":" ","pages":""},"PeriodicalIF":2.1,"publicationDate":"2021-03-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1093/JSSAM/SMAB010","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43124262","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Positive Learning or Deviant Interviewing? Mechanisms of Experience on Interviewer Behavior","authors":"Yuliya Kosyakova, Lukas Olbrich, J. Sakshaug, Silvia Schwanhäuser","doi":"10.1093/JSSAM/SMAB003","DOIUrl":"https://doi.org/10.1093/JSSAM/SMAB003","url":null,"abstract":"\u0000 Interviewer (mis)behavior has been shown to change with interviewers’ professional experience (general experience) and experience gained during the field period (survey experience). We extend this study by using both types of experiences to analyze interviewer effects on a core quality indicator: interview duration. To understand whether the effect of interviewer experience on duration is driven by increased efficiency or deviant behavior—both mechanisms of shorter interview durations—we additionally examine the triggering rate of filter questions to avoid burdensome follow-up questions and response differentiation over the field period. Using multilevel models and data from a large-scale survey on a special and difficult-to-interview population of refugees in Germany, we find that interview duration decreases with increasing survey experience, particularly among the generally inexperienced interviewers. However, this effect is not found for the triggering rate and response differentiation. The results are robust to different sample and model specifications. We conclude that the underlying mechanism driving interview duration is related to increasing efficiency, and not deviant behavior.","PeriodicalId":17146,"journal":{"name":"Journal of Survey Statistics and Methodology","volume":" ","pages":""},"PeriodicalIF":2.1,"publicationDate":"2021-02-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1093/JSSAM/SMAB003","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"48472036","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}