{"title":"Huber Principal Component Analysis for large-dimensional factor models","authors":"Yong He , Lingxiao Li , Dong Liu , Wen-Xin Zhou","doi":"10.1016/j.jeconom.2025.105993","DOIUrl":"10.1016/j.jeconom.2025.105993","url":null,"abstract":"<div><div>Factor models have been widely used in economics and finance. However, the heavy-tailed nature of macroeconomic and financial data is often neglected in statistical analysis. To address this issue, we propose a robust approach to estimate factor loadings and scores by minimizing the Huber loss function, which is motivated by the equivalence between conventional Principal Component Analysis (PCA) and the constrained least squares method in the factor model. We provide two algorithms that use different penalty forms. The first algorithm involves an element-wise-type Huber loss minimization, solved by an iterative Huber regression algorithm. The second algorithm, which we refer to as Huber PCA, minimizes the <span><math><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>2</mn></mrow></msub></math></span>-norm-type Huber loss and performs PCA on the weighted sample covariance matrix. We examine the theoretical minimizer of the element-wise Huber loss function and demonstrate that it has the same convergence rate as conventional PCA when the idiosyncratic errors have bounded second moments. We also derive their asymptotic distributions under mild conditions. Moreover, we suggest a consistent model selection criterion that relies on rank minimization to estimate the number of factors robustly. We showcase the benefits of the proposed two algorithms through extensive numerical experiments and a real macroeconomic data example. An <span>R</span> package named “<span>HDRFA</span>” has been developed to conduct the proposed robust factor analysis.</div></div>","PeriodicalId":15629,"journal":{"name":"Journal of Econometrics","volume":"249 ","pages":"Article 105993"},"PeriodicalIF":9.9,"publicationDate":"2025-03-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143642831","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Limit theory and inference in non-cointegrated functional coefficient regression","authors":"Ying Wang , Peter C.B. Phillips , Yundong Tu","doi":"10.1016/j.jeconom.2025.105996","DOIUrl":"10.1016/j.jeconom.2025.105996","url":null,"abstract":"<div><div>Functional coefficient (FC) cointegrating regressions offer empirical investigators flexibility in modeling economic relationships by introducing covariates that influence the direction and intensity of comovement among nonstationary time series. FC regression models are also useful when formal cointegration is absent, in the sense that the equation errors may themselves be nonstationary, but where the nonstationary series display well-defined FC linkages that can be meaningfully interpreted as correlation measures involving the covariates. The present paper proposes new nonparametric estimators for such FC regression models where the nonstationary series display linkages that enable consistent estimation of the correlation measures between them. Specifically, we develop <span><math><msqrt><mrow><mi>n</mi></mrow></msqrt></math></span>-consistent estimators for the functional coefficient and establish their asymptotic distributions, which involve mixed normal limits that facilitate inference. Two novel features that appear in the limit theory are (i) the need for non-diagonal matrix normalization due to the presence of stationary and nonstationary components in the regression; and (ii) random bias elements that appear in the asymptotic distribution of the kernel estimators, again resulting from the nonstationary regression components. Numerical studies reveal that the proposed estimators achieve significant efficiency improvements compared to the estimators suggested in earlier work by Sun et al. (2011). Easily implementable specification tests with standard chi-square asymptotics are suggested to check for constancy of the functional coefficient. These tests are shown to have a faster divergence rate under local alternatives and to enjoy superior performance in simulations compared with the tests proposed in Gan et al. (2014). An empirical application based on the quantity theory of money is included, illustrating the practical use of correlated but non-cointegrated regression relations.</div></div>","PeriodicalId":15629,"journal":{"name":"Journal of Econometrics","volume":"249 ","pages":"Article 105996"},"PeriodicalIF":9.9,"publicationDate":"2025-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143642830","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Adjustments with many regressors under covariate-adaptive randomizations","authors":"Liang Jiang , Liyao Li , Ke Miao , Yichong Zhang","doi":"10.1016/j.jeconom.2025.105991","DOIUrl":"10.1016/j.jeconom.2025.105991","url":null,"abstract":"<div><div>Our paper discovers a new trade-off of using regression adjustments (RAs) in causal inference under covariate-adaptive randomizations (CARs). On one hand, RAs can improve the efficiency of causal estimators by incorporating information from covariates that are not used in the randomization. On the other hand, RAs can degrade estimation efficiency due to their estimation errors, which are not asymptotically negligible when the number of regressors is of the same order as the sample size. Ignoring the estimation errors of RAs may result in serious over-rejection of causal inference under the null hypothesis. To address the issue, we construct a new ATE estimator by optimally linearly combining the estimators with and without RAs. We then develop a unified inference theory for this estimator under CARs. It has two features: (1) the Wald test based on it achieves the exact asymptotic size under the null hypothesis, regardless of whether the number of covariates is fixed or diverges no faster than the sample size; and (2) it guarantees weak efficiency improvement over estimators both with and without RAs.</div></div>","PeriodicalId":15629,"journal":{"name":"Journal of Econometrics","volume":"249 ","pages":"Article 105991"},"PeriodicalIF":9.9,"publicationDate":"2025-03-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143629093","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Estimation and uniform inference in sparse high-dimensional additive models","authors":"Philipp Bach , Sven Klaassen , Jannis Kueck , Martin Spindler","doi":"10.1016/j.jeconom.2025.105973","DOIUrl":"10.1016/j.jeconom.2025.105973","url":null,"abstract":"<div><div>We develop a novel method to construct uniformly valid confidence bands for a nonparametric component <span><math><msub><mrow><mi>f</mi></mrow><mrow><mn>1</mn></mrow></msub></math></span> in the sparse additive model <span><math><mrow><mi>Y</mi><mo>=</mo><msub><mrow><mi>f</mi></mrow><mrow><mn>1</mn></mrow></msub><mrow><mo>(</mo><msub><mrow><mi>X</mi></mrow><mrow><mn>1</mn></mrow></msub><mo>)</mo></mrow><mo>+</mo><mo>…</mo><mo>+</mo><msub><mrow><mi>f</mi></mrow><mrow><mi>p</mi></mrow></msub><mrow><mo>(</mo><msub><mrow><mi>X</mi></mrow><mrow><mi>p</mi></mrow></msub><mo>)</mo></mrow><mo>+</mo><mi>ɛ</mi></mrow></math></span> in a high-dimensional setting. Our method integrates sieve estimation into a high-dimensional Z-estimation framework, facilitating the construction of uniformly valid confidence bands for the target component <span><math><msub><mrow><mi>f</mi></mrow><mrow><mn>1</mn></mrow></msub></math></span>. To form these confidence bands, we employ a multiplier bootstrap procedure. Additionally, we provide rates for the uniform lasso estimation in high dimensions, which may be of independent interest. Through simulation studies, we demonstrate that our proposed method delivers reliable results in terms of estimation and coverage, even in small samples.</div></div>","PeriodicalId":15629,"journal":{"name":"Journal of Econometrics","volume":"249 ","pages":"Article 105973"},"PeriodicalIF":9.9,"publicationDate":"2025-03-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143551236","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Bootstrap based asymptotic refinements for high-dimensional nonlinear models","authors":"Joel L. Horowitz , Ahnaf Rafi","doi":"10.1016/j.jeconom.2025.105977","DOIUrl":"10.1016/j.jeconom.2025.105977","url":null,"abstract":"<div><div>We consider penalized extremum estimation of a high-dimensional, possibly nonlinear model that is sparse in the sense that most of its parameters are zero but some are not. We use the SCAD penalty function, which provides model selection consistent and oracle efficient estimates under suitable conditions. However, asymptotic approximations based on the oracle model can be inaccurate with the sample sizes found in many applications. This paper gives conditions under which the bootstrap, based on estimates obtained through SCAD penalization with thresholding, provides asymptotic refinements of size <span><math><mrow><mi>O</mi><mo>(</mo><msup><mrow><mi>n</mi></mrow><mrow><mo>−</mo><mn>2</mn></mrow></msup><mo>)</mo></mrow></math></span> for the error in the rejection (coverage) probability of a symmetric hypothesis test (confidence interval) and <span><math><mrow><mi>O</mi><mo>(</mo><msup><mrow><mi>n</mi></mrow><mrow><mo>−</mo><mn>1</mn></mrow></msup><mo>)</mo></mrow></math></span> for the error in the rejection (coverage) probability of a one-sided or equal tailed test (confidence interval). The results of Monte Carlo experiments show that the bootstrap can provide large reductions in errors in rejection and coverage probabilities. The bootstrap is consistent, though it does not necessarily provide asymptotic refinements, if some parameters are close but not equal to zero. Random-coefficients logit and probit models and nonlinear moment models are examples of models to which the procedure applies.</div></div>","PeriodicalId":15629,"journal":{"name":"Journal of Econometrics","volume":"249 ","pages":"Article 105977"},"PeriodicalIF":9.9,"publicationDate":"2025-03-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143551233","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Score-type tests for normal mixtures","authors":"Dante Amengual , Xinyue Bei , Marine Carrasco , Enrique Sentana","doi":"10.1016/j.jeconom.2024.105717","DOIUrl":"10.1016/j.jeconom.2024.105717","url":null,"abstract":"<div><div>Testing normality against discrete normal mixtures is complex because some parameters turn increasingly underidentified along alternative ways of approaching the null, others are inequality constrained, and several higher-order derivatives become identically 0. These problems make the maximum of the alternative model log-likelihood function numerically unreliable. We propose score-type tests asymptotically equivalent to the likelihood ratio as the largest of two simple intuitive statistics that only require estimation under the null. One novelty of our approach is that we treat symmetrically both ways of writing the null hypothesis without excluding any region of the parameter space. We derive the asymptotic distribution of our tests under the null and sequences of local alternatives. We also show that their asymptotic distribution is the same whether applied to observations or standardized residuals from heteroskedastic regression models. Finally, we study their power in simulations and apply them to the residuals of Mincer earnings functions.</div></div>","PeriodicalId":15629,"journal":{"name":"Journal of Econometrics","volume":"248 ","pages":"Article 105717"},"PeriodicalIF":9.9,"publicationDate":"2025-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140154183","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The chained difference-in-differences","authors":"Christophe Bellégo , David Benatia , Vincent Dortet-Bernadet","doi":"10.1016/j.jeconom.2024.105783","DOIUrl":"10.1016/j.jeconom.2024.105783","url":null,"abstract":"<div><div>This paper studies the identification, estimation, and inference of long-term (binary) treatment effect parameters when balanced panel data is not available, or consists of only a subset of the available data. We develop a new estimator: the chained difference-in-differences, which leverages the overlapping structure of many unbalanced panel data sets. This approach consists in aggregating a collection of short-term treatment effects estimated on multiple incomplete panels. Our estimator accommodates (1) multiple time periods, (2) variation in treatment timing, (3) treatment effect heterogeneity, (4) general missing data patterns, and (5) sample selection on observables. We establish the asymptotic properties of the proposed estimator and discuss identification and efficiency gains in comparison to existing methods. Finally, we illustrate its relevance through (i) numerical simulations, and (ii) an application about the effects of an innovation policy in France.</div></div>","PeriodicalId":15629,"journal":{"name":"Journal of Econometrics","volume":"248 ","pages":"Article 105783"},"PeriodicalIF":9.9,"publicationDate":"2025-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"143526861","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The term structure of macroeconomic risks at the effective lower bound","authors":"Guillaume Roussellet","doi":"10.1016/j.jeconom.2023.01.005","DOIUrl":"10.1016/j.jeconom.2023.01.005","url":null,"abstract":"<div><div>This paper proposes a new macro-finance model that solves the tension between tractability, flexibility in macroeconomic<span><span><span> dynamics, and consistency of the term structures of treasury yields with the effective lower bound (ELB). I use the term structures of U.S. nominal and real treasury yields from 1990 to explore the interdependence between </span>inflation expectations, volatility, and </span>monetary policy<span> at the ELB. The estimation reveals that real yields stay elevated during the ELB due to large premia and deflation fears, produced by a persistent shift in inflation<span> dynamics, with low average inflation and heightened inflation volatility.</span></span></span></div></div>","PeriodicalId":15629,"journal":{"name":"Journal of Econometrics","volume":"248 ","pages":"Article 105383"},"PeriodicalIF":9.9,"publicationDate":"2025-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"45139733","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Regularizing stock return covariance matrices via multiple testing of correlations","authors":"Richard Luger","doi":"10.1016/j.jeconom.2024.105753","DOIUrl":"10.1016/j.jeconom.2024.105753","url":null,"abstract":"<div><div>This paper develops a large-scale inference approach for the regularization of stock return covariance matrices. The framework allows for the presence of heavy tails and multivariate GARCH-type effects of unknown form among the stock returns. The approach involves simultaneous testing of all pairwise correlations, followed by setting non-statistically significant elements to zero. This adaptive thresholding is achieved through sign-based Monte Carlo resampling within multiple testing procedures, controlling either the traditional familywise error rate, a generalized familywise error rate, or the false discovery proportion. Subsequent shrinkage ensures that the final covariance matrix estimate is positive definite and well-conditioned while preserving the achieved sparsity. Compared to alternative estimators, this new regularization method demonstrates strong performance in simulation experiments and real portfolio optimization.</div></div>","PeriodicalId":15629,"journal":{"name":"Journal of Econometrics","volume":"248 ","pages":"Article 105753"},"PeriodicalIF":9.9,"publicationDate":"2025-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141056593","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}