Survey Practice: Latest Articles

Applying Machine Learning to the Evaluation of Interviewer Performance
Survey Practice · Pub Date: 2023-07-27 · DOI: 10.29115/sp-2023-0007
Hanyu Sun, Ting Yan
Abstract: Survey organizations have long used Computer Assisted Recorded Interviewing (CARI) to monitor interviewer performance. Conventionally, a human coder first listens to the audio recording of the interaction between the interviewer and the respondent and then evaluates and codes features of the question-and-answer sequence using a pre-specified coding scheme. Although prior research found that providing feedback to interviewers based on CARI was effective at improving interviewer performance, this coding process tends to be labor intensive and time consuming. To improve the effectiveness and efficiency of using CARI to monitor interviewer performance, we developed a pipeline that draws heavily on machine learning to process audio-recorded interviews. In particular, machine learning is used to detect who spoke at which turn in a question-level audio recording and to transcribe conversations at the turn level. This paper describes how the pipeline was used to detect interviewer falsification and to identify problematic interviewer behavior in recordings of both mock interviews and actual field interviews. The performance of the pipeline is discussed.
Citations: 0
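The article does not name the tools used in its pipeline. The sketch below shows one plausible way to get turn-level speaker detection and transcription from a question-level recording, assuming the open-source pyannote.audio and openai-whisper packages plus pydub for audio slicing; the model names, Hugging Face token, and file paths are placeholders, not the authors' setup.

```python
# Minimal sketch (not the authors' pipeline): diarize a recording into speaker
# turns, then transcribe each turn. All paths, model names, and the token are
# placeholders.
import whisper
from pydub import AudioSegment
from pyannote.audio import Pipeline

AUDIO_PATH = "question_recording.wav"  # placeholder path

# Speaker diarization: who spoke when (pretrained pipeline needs an HF token).
diarizer = Pipeline.from_pretrained(
    "pyannote/speaker-diarization-3.1", use_auth_token="HF_TOKEN"
)
diarization = diarizer(AUDIO_PATH)

# Speech-to-text model used to transcribe each diarized turn.
asr = whisper.load_model("base")
audio = AudioSegment.from_wav(AUDIO_PATH)

turns = []
for segment, _, speaker in diarization.itertracks(yield_label=True):
    # Slice the recording to the turn boundaries (pydub indexes in milliseconds).
    clip = audio[int(segment.start * 1000):int(segment.end * 1000)]
    clip.export("turn.wav", format="wav")
    text = asr.transcribe("turn.wav")["text"].strip()
    turns.append({"speaker": speaker, "start": segment.start, "text": text})

for t in turns:
    print(f"[{t['start']:7.2f}s] {t['speaker']}: {t['text']}")
```

The resulting turn-level transcript is the kind of intermediate output a coder (or a downstream classifier) could scan for falsification signals such as a single voice answering both roles.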
Comparing Amazon’s MTurk and a Sona Student Sample: A Test of Data Quality Using Attention and Manipulation Checks
Survey Practice · Pub Date: 2023-06-15 · DOI: 10.29115/sp-2023-0005
Yani Zhao, Sherice Gearhart
Abstract: The need for cost-effective data collection leads researchers to explore options, especially for respondent-administered online surveys. Student samples are convenient and cheap for social scientists when students fit the target population. However, student samples are criticized for their homogeneity and lack of generalizability (Kees et al. 2017). Another low-cost option is Amazon Mechanical Turk (MTurk), a crowdsourcing platform used for collecting online data from a seemingly broader population. Despite the appeal, it is important to compare data quality. The purpose here is to compare data quality between MTurk and student samples. To control data quality, researchers rely on several tactics, such as screener questions to exclude unqualified respondents (Arndt et al. 2022). A subjective manipulation check and an attention check are used to examine respondent engagement and performance. Completion speed might also indicate effort and attention. Since samples should draw participants who resemble the target population, sample diversity also serves as an indicator of data quality in this study (Kees et al. 2017; Roulin 2015). However, it should be noted that a diverse sample does not always guarantee higher sample quality, especially when conducting studies on a homogeneous population.
Citations: 0
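A minimal sketch of the kind of data-quality screening this abstract describes: flagging respondents who fail an attention check or finish implausibly fast, then comparing the flag rate across the two samples. The column names (attention_check, duration_sec, sample_source), the required response, and the speeding cutoff are hypothetical, not taken from the article's instrument.

```python
# Sketch only: flag low-effort respondents and compare across sample sources.
import pandas as pd

df = pd.read_csv("responses.csv")  # placeholder file

# Attention check: respondents were instructed to pick a specific option.
df["failed_attention"] = df["attention_check"] != "strongly agree"

# Speeding: one common heuristic flags durations below ~40% of the median.
speed_cutoff = 0.4 * df["duration_sec"].median()
df["speeder"] = df["duration_sec"] < speed_cutoff

df["low_quality"] = df["failed_attention"] | df["speeder"]

# Share of flagged respondents in each sample (e.g., MTurk vs. Sona).
print(df.groupby("sample_source")["low_quality"].mean())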
Mail to One or Mail to All? An Experiment (Sub)Sampling Drop Point Units in a Self-Administered Address-Based Sampling Frame Survey
Survey Practice · Pub Date: 2023-05-18 · DOI: 10.29115/sp-2023-0004
Taylor Lewis, Joseph McMichael, Charlotte Looby
Abstract: Practitioners utilizing an address-based sampling frame for a self-administered, mail contact survey must decide how to handle drop points, which are single delivery points or receptacles that service multiple households. A variety of strategies have been adopted, including sampling all units at the drop point or subsampling just one (or a portion) of them. This paper reports results from an experiment fielded during the 2021 Healthy Chicago Survey aimed at providing insight into whether there are any substantive differences between these approaches. We find that a subsampling strategy in which a single mailing is sent produces a roughly 3 percentage point higher response rate relative to a strategy sending multiple mailings concurrently to the drop point. While base-weighted distributions of gender and age differed enough to be statistically significant, there were no noteworthy differences across other demographics or across the base-weighted distributions of select key health outcomes measured by the survey. Taken together, these results provide some evidence that a “mail to one” drop point strategy is more efficient than a “mail to all” drop point strategy.
Citations: 0
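As a rough illustration of the response-rate comparison this experiment makes, the sketch below runs a two-proportion z-test between the two arms. The counts are placeholders chosen only to show the call, not figures from the Healthy Chicago Survey, and a full design-based analysis would also account for base weights and clustering within drop points.

```python
# Sketch only: compare response rates of the "mail to one" vs. "mail to all" arms.
from statsmodels.stats.proportion import proportions_ztest

completes = [312, 270]   # completed surveys per arm (placeholder values)
sampled = [2000, 2000]   # sampled drop-point addresses per arm (placeholder values)

stat, pvalue = proportions_ztest(count=completes, nobs=sampled)
rates = [c / n for c, n in zip(completes, sampled)]
print(f"response rates: {rates[0]:.1%} vs {rates[1]:.1%}, z = {stat:.2f}, p = {pvalue:.3f}")
```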
Adapting Clinical Instruments for Population Mental Health Surveillance: Should an Explicit “Don’t Know” Response Option Be Given?
Survey Practice · Pub Date: 2023-04-26 · DOI: 10.29115/sp-2023-0003
Rachel Suss, Tashema Bholanath, T. Dongchung, Amber Levanon Seligson, Christina C. Norman, Sarah E. Dumas
Citations: 0
How Weighting by Past Vote Can Improve Estimates of Voting Intentions
Survey Practice · Pub Date: 2023-03-13 · DOI: 10.29115/sp-2023-0001
D. Pennay, S. Misson, D. Neiger, P. Lavrakas
Abstract: Polling error for the 2020 US election was the highest in 40 years, and no mode of surveying was unambiguously more accurate. This occurred amid several recent polling failures in other countries. Online panels, as the dominant method now used by pollsters to survey voters, are well positioned to help reduce the level of bias in pre-election polls. Here, we present a case for pollsters using online panels for pre-election polling to (re)consider using past vote choice (i.e., whom respondents voted for in the previous election) as a weighting variable capable of reducing bias in their election forecasts under the right circumstances. Our data are from an Australian pre-election poll, conducted on a probability-based online panel one month prior to the 2019 Australian federal election. Three different measures of recalled vote choice for the 2016 election were used in weighting the forecast of the 2019 election outcome: (1) a short-term measure of recall for the 2016 vote choice obtained three months after the 2016 election, (2) a long-term measure obtained from the same panelists three years after the 2016 election, and (3) a hybrid measure in which a random half of panelists were allocated their short-term past vote measure for 2016 and the remainder their long-term measure. We then examined the impacts on the bias and variance of the resulting estimates of the 2019 voting intentions. Using the short-term measure of the 2016 recalled vote choice in our weighting significantly reduced the bias of the resulting 2019 voting intentions forecast, with an acceptable impact on variance, and produced less biased estimates than when using either of the other two past vote measures. The short-term recall measure also generally resulted in better estimates than a weighting approach that did not include any past vote adjustment. Implications for panel providers are discussed.
Citations: 0
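The article's core idea is to include recalled past vote as one margin when weighting the panel. The sketch below shows the generic mechanics of raking (iterative proportional fitting) to marginal targets that include a past-vote variable alongside a demographic; it is not the authors' weighting procedure, and all column names, category labels, and target shares are hypothetical.

```python
# Sketch only: rake survey weights to marginal targets, including past vote.
import pandas as pd

def rake(df, targets, weight_col="weight", max_iter=50, tol=1e-6):
    """Adjust df[weight_col] so weighted margins match the target shares.

    targets: dict mapping column name -> {category: target share}.
    """
    w = df[weight_col].astype(float).copy()
    for _ in range(max_iter):
        max_shift = 0.0
        for col, shares in targets.items():
            current = w.groupby(df[col]).sum() / w.sum()   # weighted shares now
            factors = {cat: shares[cat] / current[cat] for cat in shares}
            adj = df[col].map(factors).fillna(1.0)
            w = w * adj
            max_shift = max(max_shift, (adj - 1.0).abs().max())
        if max_shift < tol:
            break
    return w

# Hypothetical targets: demographics plus recalled 2016 vote choice.
targets = {
    "age_group": {"18-34": 0.30, "35-54": 0.35, "55+": 0.35},
    "past_vote_2016": {"Coalition": 0.42, "Labor": 0.35, "Other": 0.23},
}
df = pd.read_csv("panel_wave.csv").assign(weight=1.0)  # placeholder file
df["weight"] = rake(df, targets)
```

With weights in hand, the weighted distribution of current voting intention is the forecast; the article's comparison of short-term, long-term, and hybrid recall measures amounts to swapping in different versions of the past_vote column above.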
Null Effects of Framing Welcoming Ordinances
Survey Practice · Pub Date: 2023-02-23 · DOI: 10.29115/sp-2023-0002
D. Doherty, D. Garbarski, Pablo Guzman Rivera
Abstract: A substantial body of published work finds that seemingly trivial changes in question wording or the way an issue is framed can substantially affect the attitudes people report on surveys. We report findings from a survey of Cook County residents in which seemingly strong issue framing treatments that varied the stated purpose of welcoming ordinance provisions failed to affect reported attitudes. We find no effect in the aggregate, nor do we find effects among demographic or other seemingly relevant subgroups. The findings illustrate the important but often overlooked fact that varying how an issue is framed does not always affect reported attitudes.
Citations: 0
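For readers unfamiliar with how aggregate and subgroup framing effects are typically tested, a minimal sketch follows: an OLS model for the overall treatment effect and a treatment-by-subgroup interaction for heterogeneity. Variable names (support, frame, subgroup) and the data file are hypothetical, not the authors' dataset or specification.

```python
# Sketch only: test for framing effects overall and within a subgroup.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("framing_experiment.csv")  # placeholder file

# Aggregate effect: does mean reported support differ across framing conditions?
overall = smf.ols("support ~ C(frame)", data=df).fit()
print(overall.summary())

# Heterogeneity: does the framing effect differ for a given subgroup?
interaction = smf.ols("support ~ C(frame) * C(subgroup)", data=df).fit()
print(interaction.summary())
```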
The Shy Respondent and Propensity to Participate in Surveys: A Proof-of-Concept Study
Survey Practice · Pub Date: 2023-01-12 · DOI: 10.29115/sp-2014-0026
John Boyle, James Dayton, Randy ZuWallack, Ronaldo Iachan
Abstract: Democratic presidential support among voters was overstated in 88% of national polls in 2016 and 93% in 2020. The "shy voter" phenomenon from British electoral politics was one explanation offered for the 2016 elections. Although a similar pattern occurred in 2020 election polls, the evidence does not support misrepresentation of voting intent by Trump supporters as the explanation. However, an alternative hypothesis of a self-selection bias against Trump voters in pre-election surveys has been proposed. Moreover, if these Trump voters were less likely to participate in surveys due to a psychosocial predisposition, then their absence might not be corrected by sample weighting based on demographics and party affiliation. This study explores whether there is a segment of the population with a personality or behavioral predisposition that makes them want to avoid polls ("shy respondents") and whether this affects their likelihood of survey participation and voting. As part of a national survey of motivators and barriers to survey participation, we had a proxy measure of survey shyness: "I prefer to stay out of sight and not be counted in government surveys." We compare this stated predisposition to both willingness to participate in surveys and likelihood of voting. We find this "shy respondent" measure is related to stated willingness to participate in future surveys. Although "shy respondents" were less likely to vote than others, a majority "always" vote in presidential elections. Collectively, "shy respondents" who are unlikely to participate in surveys represent about 10% of "likely voters" in presidential elections. This survey shyness is also related to political alienation measures, which may lead to underrepresentation of more populist-leaning respondents in pre-election surveys. The relationship of survey shyness to demographics is generally slight, so demographic weighting is unlikely to correct for underrepresentation of "shy respondents" in pre-election polls.
Citations: 0
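The study's basic analysis relates agreement with the "shy respondent" proxy item to stated willingness to take future surveys. A minimal sketch of that kind of cross-tabulation and chi-square test is below; column names, category labels, and the data file are hypothetical.

```python
# Sketch only: cross-tabulate the shyness proxy against survey willingness.
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("shy_respondent_survey.csv")  # placeholder file

# shy_proxy: agreement with "I prefer to stay out of sight and not be counted
# in government surveys"; future_willingness: stated likelihood of participating.
table = pd.crosstab(df["shy_proxy"], df["future_willingness"])
chi2, p, dof, _ = chi2_contingency(table)
print(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```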
Recruiting Hard-to-Reach Populations Amid the COVID-19 Pandemic
Survey Practice · Pub Date: 2023-01-01 (Epub 2023-08-17) · DOI: 10.29115/sp-2023-0011
Jacob Boelter, Alexis M. Dennis, Lisa Klein Vogel, Kenneth D. Croes
Abstract: The COVID-19 pandemic introduced many challenges for conducting research, particularly for research studies reliant on community-based sample generation strategies. In late 2021, we undertook two qualitative research studies for which we needed to identify and recruit hard-to-reach populations from the community. This brief describes our approach to adapting traditional, in-person methods to virtual means of disseminating study information and connecting with potential participants, with implications for future recruitment efforts in situations when in-person options are constrained.
Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11173354/pdf/
Citations: 0
Timing estimates for complex programmed surveys
Survey Practice · Pub Date: 2022-11-03 · DOI: 10.29115/sp-2022-0011
E. Loewen, E. Bauer, M. Thompson, Nadia Martin, Anne C. K. Quah, G. Fong
Citations: 0
Assessing Measurement Error in Hypothetical Questions
Survey Practice · Pub Date: 2022-10-20 · DOI: 10.29115/sp-2022-0010
Adam Kaderabek, J. Sinibaldi
Citations: 0