Modelling Mode Effects for a Panel Survey in Transition
P. Biemer, K. Harris, Dan Liao, B. Burke, C. Halpern
DOI: 10.1093/oso/9780198859987.003.0004
Published in: Measurement Error in Longitudinal Data
Citations: 3
Abstract
Funding reductions, combined with increasing data-collection costs, required that Wave V of the USA’s National Longitudinal Study of Adolescent to Adult Health (Add Health) abandon its traditional approach of in-person interviewing and adopt a more cost-effective method. The new approach used the mail/web mode in Phase 1 of data collection and in-person interviewing for a random sample of nonrespondents in Phase 2. In addition, to facilitate the comparison of modes, a small random subsample served as a control and received the traditional in-person interview. Based on an analysis of the survey data, we show that concerns about reduced data quality as a result of the redesign were unfounded. In several important respects, the new two-phase, mixed-mode design outperformed the traditional design, with greater measurement accuracy, improved weighting adjustments for mitigating the risk of nonresponse bias, reduced residual (post-adjustment) nonresponse bias, and substantially reduced total mean squared error of the estimates. This good news was largely unexpected, given the preponderance of literature suggesting that data quality could be adversely affected by the transition to a mixed-mode design. The bad news is that the transition carries a high risk of mode effects when comparing Wave V estimates with those from prior waves. Analytical results suggest that significant differences in longitudinal change estimates can occur about 60% of the time purely as an artifact of the redesign. This raises the question: how should a data analyst interpret significant findings in a longitudinal analysis in the presence of mode effects? This chapter presents the analytical results and attempts to address this question.
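To make the mechanism behind that warning concrete, the following minimal simulation sketch (in Python; not drawn from the chapter, with all parameter values chosen as illustrative assumptions) shows how a modest measurement shift attached to the new mode can cause a paired test of wave-to-wave change to reject the no-change hypothesis far more often than the nominal 5% level, even though the underlying trait is stable across waves.

# Illustrative sketch only; not the chapter's analysis. Panel size, error
# variances, and the mode-shift magnitude are assumptions for demonstration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2024)
n_panel = 500        # hypothetical panel size
n_reps = 2000        # simulation replications
mode_shift = 0.07    # assumed shift induced by the mail/web mode (SD units)

false_positives = 0
for _ in range(n_reps):
    trait = rng.normal(0.0, 1.0, n_panel)               # stable latent trait
    wave_iv = trait + rng.normal(0.0, 0.5, n_panel)     # in-person measurement
    wave_v = trait + mode_shift + rng.normal(0.0, 0.5, n_panel)  # new-mode measurement
    _, p_value = stats.ttest_rel(wave_v, wave_iv)       # paired test of "change"
    false_positives += p_value < 0.05

print(f"Share of replications declaring spurious change: {false_positives / n_reps:.2f}")

Under these assumed settings the paired test flags a "significant" change in a large share of replications despite the latent trait never changing, which is the sense in which a mode effect can appear as an artifact of the redesign rather than a real longitudinal shift.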