{"title":"提高外呼 CATI 作为基于地址样本的无应答随访模式的效率:动态自适应设计的准实验评估","authors":"Michael T Jackson, Todd Hughes, Jiangzhou Fu","doi":"10.1093/jssam/smae005","DOIUrl":null,"url":null,"abstract":"\n This article evaluates the use of dynamic adaptive design methods to target outbound computer-assisted telephone interviewing (CATI) in the California Health Interview Survey (CHIS). CHIS is a large-scale, annual study that uses an address-based sample (ABS) with push-to-Web mailings, followed by outbound CATI follow-up for addresses with appended phone numbers. CHIS 2022 implemented a dynamic adaptive design in which predictive models were used to end dialing early for some cases. For addresses that received outbound CATI follow-up, dialing was paused after three calls. A response propensity (RP) model was applied to predict the probability that the address would respond to continued dialing, based on the outcomes of the first three calls. Low-RP addresses were permanently retired with no additional dialing, while the rest continued through six or more attempts. We use a difference-in-difference design to evaluate the effect of the adaptive design on calling effort, completion rates, and the demographic composition of respondents. We find that the adaptive design reduced the mean number of calls per sampled unit by about 14 percent (relative to a modeled no-adaptive-design counterfactual) with a minimal reduction in the completion rate and no strong evidence of changes in the prevalence of target demographics. This suggests that RP modeling can meaningfully distinguish between ABS sample units for which additional dialing is and is not productive, helping to control outbound dialing costs without compromising sample representativeness.","PeriodicalId":17146,"journal":{"name":"Journal of Survey Statistics and Methodology","volume":null,"pages":null},"PeriodicalIF":1.6000,"publicationDate":"2024-03-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Improving the Efficiency of Outbound CATI As a Nonresponse Follow-Up Mode in Address-Based Samples: A Quasi-Experimental Evaluation of a Dynamic Adaptive Design\",\"authors\":\"Michael T Jackson, Todd Hughes, Jiangzhou Fu\",\"doi\":\"10.1093/jssam/smae005\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"\\n This article evaluates the use of dynamic adaptive design methods to target outbound computer-assisted telephone interviewing (CATI) in the California Health Interview Survey (CHIS). CHIS is a large-scale, annual study that uses an address-based sample (ABS) with push-to-Web mailings, followed by outbound CATI follow-up for addresses with appended phone numbers. CHIS 2022 implemented a dynamic adaptive design in which predictive models were used to end dialing early for some cases. For addresses that received outbound CATI follow-up, dialing was paused after three calls. A response propensity (RP) model was applied to predict the probability that the address would respond to continued dialing, based on the outcomes of the first three calls. Low-RP addresses were permanently retired with no additional dialing, while the rest continued through six or more attempts. We use a difference-in-difference design to evaluate the effect of the adaptive design on calling effort, completion rates, and the demographic composition of respondents. 
We find that the adaptive design reduced the mean number of calls per sampled unit by about 14 percent (relative to a modeled no-adaptive-design counterfactual) with a minimal reduction in the completion rate and no strong evidence of changes in the prevalence of target demographics. This suggests that RP modeling can meaningfully distinguish between ABS sample units for which additional dialing is and is not productive, helping to control outbound dialing costs without compromising sample representativeness.\",\"PeriodicalId\":17146,\"journal\":{\"name\":\"Journal of Survey Statistics and Methodology\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2024-03-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Survey Statistics and Methodology\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1093/jssam/smae005\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"SOCIAL SCIENCES, MATHEMATICAL METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Survey Statistics and Methodology","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1093/jssam/smae005","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"SOCIAL SCIENCES, MATHEMATICAL METHODS","Score":null,"Total":0}
Improving the Efficiency of Outbound CATI As a Nonresponse Follow-Up Mode in Address-Based Samples: A Quasi-Experimental Evaluation of a Dynamic Adaptive Design
This article evaluates the use of dynamic adaptive design methods to target outbound computer-assisted telephone interviewing (CATI) in the California Health Interview Survey (CHIS). CHIS is a large-scale, annual study that uses an address-based sample (ABS) with push-to-Web mailings, followed by outbound CATI follow-up for addresses with appended phone numbers. CHIS 2022 implemented a dynamic adaptive design in which predictive models were used to end dialing early for some cases. For addresses that received outbound CATI follow-up, dialing was paused after three calls. A response propensity (RP) model was applied to predict the probability that the address would respond to continued dialing, based on the outcomes of the first three calls. Low-RP addresses were permanently retired with no additional dialing, while the rest continued through six or more attempts. We use a difference-in-difference design to evaluate the effect of the adaptive design on calling effort, completion rates, and the demographic composition of respondents. We find that the adaptive design reduced the mean number of calls per sampled unit by about 14 percent (relative to a modeled no-adaptive-design counterfactual) with a minimal reduction in the completion rate and no strong evidence of changes in the prevalence of target demographics. This suggests that RP modeling can meaningfully distinguish between ABS sample units for which additional dialing is and is not productive, helping to control outbound dialing costs without compromising sample representativeness.
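To make the stopping rule concrete, the sketch below illustrates (it is not the authors' production CHIS system) how a response-propensity cutoff could drive the retire/continue decision after an address's third call. The feature names, the toy training data, and the 0.05 cutoff are assumptions introduced for illustration only; the paper's actual RP model, predictors, and threshold may differ.

```python
# Minimal illustrative sketch of an RP-based dynamic adaptive stopping rule.
# All column names, the toy training frame, and the cutoff are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical historical data: one row per address that received outbound
# CATI follow-up, with outcomes of calls 1-3 and an indicator of whether the
# address eventually responded when dialing continued past the third call.
history = pd.DataFrame({
    "n_answered_first3":   [0, 1, 0, 2, 0, 1],   # live contacts in calls 1-3
    "n_voicemail_first3":  [3, 1, 0, 1, 2, 2],   # answering-machine outcomes
    "n_bad_number_first3": [0, 0, 3, 0, 1, 0],   # disconnected / wrong number
    "responded_later":     [0, 1, 0, 1, 0, 1],   # completed after call 3
})

features = ["n_answered_first3", "n_voicemail_first3", "n_bad_number_first3"]
rp_model = LogisticRegression().fit(history[features], history["responded_later"])

def dialing_decision(paused_cases: pd.DataFrame, cutoff: float = 0.05) -> pd.Series:
    """Return 'retire' or 'continue' for each address paused after three calls."""
    propensity = rp_model.predict_proba(paused_cases[features])[:, 1]
    return pd.Series(
        ["retire" if p < cutoff else "continue" for p in propensity],
        index=paused_cases.index,
        name="decision",
    )

# Example: score two paused addresses. Low-RP addresses are permanently
# retired; the rest continue through six or more attempts.
paused = pd.DataFrame({
    "n_answered_first3":   [0, 1],
    "n_voicemail_first3":  [0, 2],
    "n_bad_number_first3": [3, 0],
})
print(dialing_decision(paused))
```

In practice, the cutoff would be tuned so that the completions expected to be lost by retiring low-RP addresses remain small relative to the dialing effort saved, which is precisely the trade-off the difference-in-difference evaluation described above is designed to measure.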
About the Journal:
The Journal of Survey Statistics and Methodology, sponsored by AAPOR and the American Statistical Association, began publishing in 2013. Its objective is to publish cutting-edge scholarly articles on statistical and methodological issues for sample surveys, censuses, administrative record systems, and other related data. It aims to be the flagship journal for research on survey statistics and methodology. Topics of interest include survey sample design, statistical inference, nonresponse, measurement error, the effects of modes of data collection, paradata and responsive survey design, combining data from multiple sources, record linkage, disclosure limitation, and other issues in survey statistics and methodology. The journal publishes both theoretical and applied papers, provided the theory is motivated by an important applied problem and the applied papers report on research that contributes generalizable knowledge to the field. Review papers are also welcome. Papers on a broad range of surveys are encouraged, including (but not limited to) surveys concerning business, economics, marketing research, social science, the environment, epidemiology, biostatistics, and official statistics.

The journal has three sections. The Survey Statistics section presents papers on innovative sampling procedures, imputation, weighting, measures of uncertainty, small-area inference, new methods of analysis, and other statistical issues related to surveys. The Survey Methodology section presents papers that focus on methodological research, including methodological experiments, methods of data collection, and the use of paradata. The Applications section contains papers involving innovative applications of methods that provide practical contributions and guidance and/or significant new findings.