{"title":"Divide and conquer for accelerated failure time model with massive time-to-event data","authors":"Wen Su, Guosheng Yin, Jing Zhang, Xingqiu Zhao","doi":"10.1002/cjs.11725","DOIUrl":null,"url":null,"abstract":"<p>Big data present new theoretical and computational challenges as well as tremendous opportunities in many fields. In health care research, we develop a novel divide-and-conquer (DAC) approach to deal with massive and right-censored data under the accelerated failure time model, where the sample size is extraordinarily large and the dimension of predictors is large but smaller than the sample size. Specifically, we construct a penalized loss function by approximating the weighted least squares loss function by combining estimation results without penalization from all subsets. The resulting adaptive LASSO penalized DAC estimator enjoys the oracle property. Simulation studies demonstrate that the proposed DAC procedure performs well and also reduces the computation time with satisfactory performance compared with estimation results using the full data. Our proposed DAC approach is applied to a massive dataset from the Chinese Longitudinal Healthy Longevity Survey.</p>","PeriodicalId":55281,"journal":{"name":"Canadian Journal of Statistics-Revue Canadienne De Statistique","volume":null,"pages":null},"PeriodicalIF":0.8000,"publicationDate":"2022-08-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Canadian Journal of Statistics-Revue Canadienne De Statistique","FirstCategoryId":"100","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1002/cjs.11725","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Citations: 0
Abstract
Big data present new theoretical and computational challenges, as well as tremendous opportunities, in many fields. In health care research, we develop a novel divide-and-conquer (DAC) approach for massive, right-censored data under the accelerated failure time model, where the sample size is extraordinarily large and the dimension of predictors is large but smaller than the sample size. Specifically, we construct a penalized loss function that approximates the full-data weighted least squares loss function by combining the unpenalized estimation results from all subsets. The resulting adaptive LASSO penalized DAC estimator enjoys the oracle property. Simulation studies demonstrate that the proposed DAC procedure performs well and reduces computation time while delivering satisfactory performance relative to estimation with the full data. The proposed DAC approach is applied to a massive dataset from the Chinese Longitudinal Healthy Longevity Survey.
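The DAC recipe in the abstract can be illustrated with a short sketch: fit an unpenalized weighted least squares (WLS) estimator of the log-linear AFT model on each subset, combine the subset estimates and their curvature matrices into a quadratic approximation of the full-data WLS loss, and then minimize that approximation plus an adaptive LASSO penalty. The sketch below is not the authors' implementation; the Kaplan-Meier (Stute) weights for censoring, the random partition, the coordinate-descent solver, and all function and parameter names are illustrative assumptions.

```python
# A minimal, self-contained sketch of a DAC + adaptive-LASSO workflow for the
# weighted least squares AFT estimator. All names and modelling choices here
# are assumptions for illustration, not taken from the paper.
import numpy as np


def stute_weights(time, delta):
    """Kaplan-Meier (Stute) weights for the WLS-AFT loss.

    `time` must be sorted increasingly; `delta` is 1 for uncensored and 0 for
    right-censored observations."""
    n = len(time)
    w = np.zeros(n)
    surv = 1.0
    for i in range(n):
        jump = delta[i] / (n - i)      # KM jump at the i-th ordered time
        w[i] = surv * jump             # mass falls on uncensored points only
        surv *= 1.0 - jump
    return w


def subset_wls(X, log_time, w):
    """Unpenalized WLS estimate on one subset, with its curvature matrix."""
    Xw = X * w[:, None]
    A = Xw.T @ X                       # curvature of the subset WLS loss
    beta = np.linalg.solve(A, Xw.T @ log_time)
    return beta, A


def dac_adaptive_lasso(X, time, delta, n_splits=10, lam=0.1, n_iter=200, seed=0):
    """DAC estimator: combine subset WLS fits into one quadratic loss and
    minimize it plus an adaptive-LASSO penalty by coordinate descent."""
    p = X.shape[1]
    rng = np.random.default_rng(seed)
    parts = np.array_split(rng.permutation(len(time)), n_splits)

    # Step 1: unpenalized WLS on every subset; keep only (beta_k, A_k).
    A_sum, Ab_sum = np.zeros((p, p)), np.zeros(p)
    for idx in parts:
        idx = idx[np.argsort(time[idx])]           # Stute weights need order
        w = stute_weights(time[idx], delta[idx])
        beta_k, A_k = subset_wls(X[idx], np.log(time[idx]), w)
        A_sum += A_k
        Ab_sum += A_k @ beta_k

    # Step 2: combined unpenalized estimator and adaptive-LASSO weights.
    beta_tilde = np.linalg.solve(A_sum, Ab_sum)
    pen = 1.0 / np.maximum(np.abs(beta_tilde), 1e-10)

    # Step 3: coordinate descent on
    #   0.5 * (b - beta_tilde)' A_sum (b - beta_tilde) + lam * sum(pen * |b|).
    beta = beta_tilde.copy()
    for _ in range(n_iter):
        for j in range(p):
            c = A_sum[j] @ (beta_tilde - beta) + A_sum[j, j] * beta[j]
            beta[j] = np.sign(c) * max(abs(c) - lam * pen[j], 0.0) / A_sum[j, j]
    return beta
```

In this sketch each subset contributes only its estimate and curvature matrix, so the combined quadratic loss can be formed and penalized without ever touching the full dataset again; the adaptive weights come from the combined unpenalized estimator.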
About the journal:
The Canadian Journal of Statistics is the official journal of the Statistical Society of Canada and has an international reputation as an excellent journal. The editorial board comprises statistical scientists with applied, computational, methodological, theoretical, and probabilistic interests; their role is to ensure that the journal continues to provide an international forum for the discipline of statistics.
The journal seeks papers that make points of broad interest to many readers; papers making important points of more specific interest are better placed in more specialized journals. The levels of innovation and impact are key criteria in the evaluation of submitted manuscripts.