{"title":"在一项长期的小组研究中,分析潜在的不可忽视的选择偏差。","authors":"Heather M Schroeder, Brady T West","doi":"10.1093/jssam/smae039","DOIUrl":null,"url":null,"abstract":"<p><p>Typical design-based methods for weighting probability samples rely on several assumptions, including the random selection of sampled units according to known probabilities of selection and ignorable unit nonresponse. If any of these assumptions are not met, weighting methods that account for the probabilities of selection, nonresponse, and calibration may not fully account for the potential selection bias in a given sample, which could produce misleading population estimates. This analysis investigates possible selection bias in the 2019 Health Survey Mailer (HSM), a sub-study of the longitudinal Health and Retirement Study (HRS). The primary HRS data collection has occurred in \"even\" years since 1992, but additional survey data collections take place in the \"off-wave\" odd years via mailed invitations sent to selected participants. While the HSM achieved a high response rate (83 percent), the assumption of ignorable probability-based selection of HRS panel members may not hold due to the eligibility criteria that were imposed. To investigate this possible non-ignorable selection bias, our analysis utilizes a novel analysis method for estimating measures of unadjusted bias for proportions (MUBP), introduced by Andridge et al. in 2019. This method incorporates aggregate information from the larger HRS target population, including means, variances, and covariances for key covariates related to the HSM variables, to inform estimates of proportions. We explore potential non-ignorable selection bias by comparing proportions calculated from the HSM under three conditions: ignoring HRS weights, weighting based on the usual design-based approach for HRS \"off-wave\" mail surveys, and using the MUBP adjustment. We find examples of differences between the weighted and MUBP-adjusted estimates in four out of ten outcomes we analyzed. However, these differences are modest, and while this result gives some evidence of non-ignorable selection bias, typical design-based weighting methods are sufficient for correcting for it and their use is appropriate in this case.</p>","PeriodicalId":17146,"journal":{"name":"Journal of Survey Statistics and Methodology","volume":"13 1","pages":"100-127"},"PeriodicalIF":1.6000,"publicationDate":"2024-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11770253/pdf/","citationCount":"0","resultStr":"{\"title\":\"Analyzing Potential Non-Ignorable Selection Bias in an Off-Wave Mail Survey Implemented in a Long-Standing Panel Study.\",\"authors\":\"Heather M Schroeder, Brady T West\",\"doi\":\"10.1093/jssam/smae039\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Typical design-based methods for weighting probability samples rely on several assumptions, including the random selection of sampled units according to known probabilities of selection and ignorable unit nonresponse. If any of these assumptions are not met, weighting methods that account for the probabilities of selection, nonresponse, and calibration may not fully account for the potential selection bias in a given sample, which could produce misleading population estimates. This analysis investigates possible selection bias in the 2019 Health Survey Mailer (HSM), a sub-study of the longitudinal Health and Retirement Study (HRS). 
The primary HRS data collection has occurred in \\\"even\\\" years since 1992, but additional survey data collections take place in the \\\"off-wave\\\" odd years via mailed invitations sent to selected participants. While the HSM achieved a high response rate (83 percent), the assumption of ignorable probability-based selection of HRS panel members may not hold due to the eligibility criteria that were imposed. To investigate this possible non-ignorable selection bias, our analysis utilizes a novel analysis method for estimating measures of unadjusted bias for proportions (MUBP), introduced by Andridge et al. in 2019. This method incorporates aggregate information from the larger HRS target population, including means, variances, and covariances for key covariates related to the HSM variables, to inform estimates of proportions. We explore potential non-ignorable selection bias by comparing proportions calculated from the HSM under three conditions: ignoring HRS weights, weighting based on the usual design-based approach for HRS \\\"off-wave\\\" mail surveys, and using the MUBP adjustment. We find examples of differences between the weighted and MUBP-adjusted estimates in four out of ten outcomes we analyzed. However, these differences are modest, and while this result gives some evidence of non-ignorable selection bias, typical design-based weighting methods are sufficient for correcting for it and their use is appropriate in this case.</p>\",\"PeriodicalId\":17146,\"journal\":{\"name\":\"Journal of Survey Statistics and Methodology\",\"volume\":\"13 1\",\"pages\":\"100-127\"},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2024-10-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11770253/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Survey Statistics and Methodology\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1093/jssam/smae039\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2025/2/1 0:00:00\",\"PubModel\":\"eCollection\",\"JCR\":\"Q2\",\"JCRName\":\"SOCIAL SCIENCES, MATHEMATICAL METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Survey Statistics and Methodology","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1093/jssam/smae039","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/2/1 0:00:00","PubModel":"eCollection","JCR":"Q2","JCRName":"SOCIAL SCIENCES, MATHEMATICAL METHODS","Score":null,"Total":0}
Analyzing Potential Non-Ignorable Selection Bias in an Off-Wave Mail Survey Implemented in a Long-Standing Panel Study.
Typical design-based methods for weighting probability samples rely on several assumptions, including the random selection of sampled units according to known probabilities of selection and ignorable unit nonresponse. If any of these assumptions are not met, weighting methods that adjust for the probabilities of selection, nonresponse, and calibration may not fully correct for the potential selection bias in a given sample, which could produce misleading population estimates. This analysis investigates possible selection bias in the 2019 Health Survey Mailer (HSM), a sub-study of the longitudinal Health and Retirement Study (HRS). The primary HRS data collection has occurred in "even" years since 1992, but additional survey data collections take place in the "off-wave" odd years via mailed invitations sent to selected participants. While the HSM achieved a high response rate (83 percent), the assumption of ignorable probability-based selection of HRS panel members may not hold due to the eligibility criteria that were imposed. To investigate this possible non-ignorable selection bias, our analysis applies a novel method for estimating measures of unadjusted bias for proportions (MUBP), introduced by Andridge et al. in 2019. This method incorporates aggregate information from the larger HRS target population, including means, variances, and covariances for key covariates related to the HSM variables, to inform estimates of proportions. We explore potential non-ignorable selection bias by comparing proportions calculated from the HSM under three conditions: ignoring the HRS weights, weighting based on the usual design-based approach for HRS "off-wave" mail surveys, and applying the MUBP adjustment. We find differences between the weighted and MUBP-adjusted estimates for four of the ten outcomes analyzed. However, these differences are modest; while this result provides some evidence of non-ignorable selection bias, typical design-based weighting methods are sufficient to correct for it, and their use is appropriate in this case.
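As a rough illustration of the first two estimation conditions compared in the abstract, the sketch below computes an unweighted proportion and a design-weighted (Hájek-type) proportion for a single binary outcome. All data, sample sizes, and weights are hypothetical stand-ins rather than HSM or HRS values, and the MUBP adjustment itself is not reproduced here; that step additionally requires the aggregate population covariate information and the modeling framework described in Andridge et al. (2019).

```python
# Minimal sketch (hypothetical data): unweighted vs. design-weighted proportion
# estimates, mirroring two of the three conditions compared in the paper.
import numpy as np

rng = np.random.default_rng(2019)

n = 1_000                                      # hypothetical number of respondents
y = rng.binomial(1, 0.35, size=n)              # binary outcome (e.g., reports a condition)
w = rng.gamma(shape=2.0, scale=1.5, size=n)    # hypothetical design-based survey weights

# Condition 1: unweighted proportion (ignoring the panel weights).
p_unweighted = y.mean()

# Condition 2: design-weighted (Hájek) proportion, sum(w * y) / sum(w).
p_weighted = np.sum(w * y) / np.sum(w)

print(f"unweighted : {p_unweighted:.3f}")
print(f"weighted   : {p_weighted:.3f}")
print(f"difference : {p_weighted - p_unweighted:+.3f}")
```

In the paper, the substantive comparison is between the design-weighted estimate and the MUBP-adjusted estimate; the unweighted estimate serves mainly as a baseline for gauging how much the weighting alone moves each proportion.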
Journal introduction:
The Journal of Survey Statistics and Methodology, sponsored by AAPOR and the American Statistical Association, began publishing in 2013. Its objective is to publish cutting-edge scholarly articles on statistical and methodological issues for sample surveys, censuses, administrative record systems, and other related data. It aims to be the flagship journal for research on survey statistics and methodology. Topics of interest include survey sample design, statistical inference, nonresponse, measurement error, the effects of modes of data collection, paradata and responsive survey design, combining data from multiple sources, record linkage, disclosure limitation, and other issues in survey statistics and methodology.

The journal publishes both theoretical and applied papers, provided the theory is motivated by an important applied problem and the applied papers report on research that contributes generalizable knowledge to the field. Review papers are also welcomed. Papers on a broad range of surveys are encouraged, including (but not limited to) surveys concerning business, economics, marketing research, social science, environment, epidemiology, biostatistics, and official statistics.

The journal has three sections. The Survey Statistics section presents papers on innovative sampling procedures, imputation, weighting, measures of uncertainty, small area inference, new methods of analysis, and other statistical issues related to surveys. The Survey Methodology section presents papers that focus on methodological research, including methodological experiments, methods of data collection, and the use of paradata. The Applications section contains papers involving innovative applications of methods that provide practical contributions and guidance and/or significant new findings.