{"title":"Simple robust two-stage estimation and inference for generalized impulse responses and multi-horizon causality","authors":"Jean-Marie Dufour, Endong Wang","doi":"arxiv-2409.10820","DOIUrl":"https://doi.org/arxiv-2409.10820","url":null,"abstract":"This paper introduces a novel two-stage estimation and inference procedure\u0000for generalized impulse responses (GIRs). GIRs encompass all coefficients in a\u0000multi-horizon linear projection model of future outcomes of y on lagged values\u0000(Dufour and Renault, 1998), which include the Sims' impulse response. The\u0000conventional use of Least Squares (LS) with heteroskedasticity- and\u0000autocorrelation-consistent covariance estimation is less precise and often\u0000results in unreliable finite sample tests, further complicated by the selection\u0000of bandwidth and kernel functions. Our two-stage method surpasses the LS\u0000approach in terms of estimation efficiency and inference robustness. The\u0000robustness stems from our proposed covariance matrix estimates, which eliminate\u0000the need to correct for serial correlation in the multi-horizon projection\u0000residuals. Our method accommodates non-stationary data and allows the\u0000projection horizon to grow with sample size. Monte Carlo simulations\u0000demonstrate our two-stage method outperforms the LS method. We apply the\u0000two-stage method to investigate the GIRs, implement multi-horizon Granger\u0000causality test, and find that economic uncertainty exerts both short-run (1-3\u0000months) and long-run (30 months) effects on economic activities.","PeriodicalId":501293,"journal":{"name":"arXiv - ECON - Econometrics","volume":"18 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142260943","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Simple and Adaptive Confidence Interval when Nuisance Parameters Satisfy an Inequality","authors":"Gregory Fletcher Cox","doi":"arxiv-2409.09962","DOIUrl":"https://doi.org/arxiv-2409.09962","url":null,"abstract":"Inequalities may appear in many models. They can be as simple as assuming a\u0000parameter is nonnegative, possibly a regression coefficient or a treatment\u0000effect. This paper focuses on the case that there is only one inequality and\u0000proposes a confidence interval that is particularly attractive, called the\u0000inequality-imposed confidence interval (IICI). The IICI is simple. It does not\u0000require simulations or tuning parameters. The IICI is adaptive. It reduces to\u0000the usual confidence interval (calculated by adding and subtracting the\u0000standard error times the $1 - alpha/2$ standard normal quantile) when the\u0000inequality is sufficiently slack. When the inequality is sufficiently violated,\u0000the IICI reduces to an equality-imposed confidence interval (the usual\u0000confidence interval for the submodel where the inequality holds with equality).\u0000Also, the IICI is uniformly valid and has (weakly) shorter length than the\u0000usual confidence interval; it is never longer. The first empirical application\u0000considers a linear regression when a coefficient is known to be nonpositive. A\u0000second empirical application considers an instrumental variables regression\u0000when the endogeneity of a regressor is known to be nonnegative.","PeriodicalId":501293,"journal":{"name":"arXiv - ECON - Econometrics","volume":"17 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142260945","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Why you should also use OLS estimation of tail exponents","authors":"Thiago Trafane Oliveira SantosCentral Bank of Brazil, Brasília, Brazil. Department of %Economics, University of Brasilia, Brazil, Daniel Oliveira CajueiroDepartment of Economics, University of Brasilia, Brazil. National Institute of Science and Technology for Complex Systems","doi":"arxiv-2409.10448","DOIUrl":"https://doi.org/arxiv-2409.10448","url":null,"abstract":"Even though practitioners often estimate Pareto exponents running OLS\u0000rank-size regressions, the usual recommendation is to use the Hill MLE with a\u0000small-sample correction instead, due to its unbiasedness and efficiency. In\u0000this paper, we advocate that you should also apply OLS in empirical\u0000applications. On the one hand, we demonstrate that, with a small-sample\u0000correction, the OLS estimator is also unbiased. On the other hand, we show that\u0000the MLE assigns significantly greater weight to smaller observations. This\u0000suggests that the OLS estimator may outperform the MLE in cases where the\u0000distribution is (i) strictly Pareto but only in the upper tail or (ii)\u0000regularly varying rather than strictly Pareto. We substantiate our theoretical\u0000findings with Monte Carlo simulations and real-world applications,\u0000demonstrating the practical relevance of the OLS method in estimating tail\u0000exponents.","PeriodicalId":501293,"journal":{"name":"arXiv - ECON - Econometrics","volume":"29 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142261052","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"GPT takes the SAT: Tracing changes in Test Difficulty and Math Performance of Students","authors":"Vikram Krishnaveti, Saannidhya Rawat","doi":"arxiv-2409.10750","DOIUrl":"https://doi.org/arxiv-2409.10750","url":null,"abstract":"Scholastic Aptitude Test (SAT) is crucial for college admissions but its\u0000effectiveness and relevance are increasingly questioned. This paper enhances\u0000Synthetic Control methods by introducing \"Transformed Control\", a novel method\u0000that employs Large Language Models (LLMs) powered by Artificial Intelligence to\u0000generate control groups. We utilize OpenAI's API to generate a control group\u0000where GPT-4, or ChatGPT, takes multiple SATs annually from 2008 to 2023. This\u0000control group helps analyze shifts in SAT math difficulty over time, starting\u0000from the baseline year of 2008. Using parallel trends, we calculate the Average\u0000Difference in Scores (ADS) to assess changes in high school students' math\u0000performance. Our results indicate a significant decrease in the difficulty of\u0000the SAT math section over time, alongside a decline in students' math\u0000performance. The analysis shows a 71-point drop in the rigor of SAT math from\u00002008 to 2023, with student performance decreasing by 36 points, resulting in a\u0000107-point total divergence in average student math performance. We investigate\u0000possible mechanisms for this decline in math proficiency, such as changing\u0000university selection criteria, increased screen time, grade inflation, and\u0000worsening adolescent mental health. Disparities among demographic groups show a\u0000104-point drop for White students, 84 points for Black students, and 53 points\u0000for Asian students. Male students saw a 117-point reduction, while female\u0000students had a 100-point decrease.","PeriodicalId":501293,"journal":{"name":"arXiv - ECON - Econometrics","volume":"195 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142260944","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On LASSO Inference for High Dimensional Predictive Regression","authors":"Zhan Gao, Ji Hyung Lee, Ziwei Mei, Zhentao Shi","doi":"arxiv-2409.10030","DOIUrl":"https://doi.org/arxiv-2409.10030","url":null,"abstract":"LASSO introduces shrinkage bias into estimated coefficients, which can\u0000adversely affect the desirable asymptotic normality and invalidate the standard\u0000inferential procedure based on the $t$-statistic. The desparsified LASSO has\u0000emerged as a well-known remedy for this issue. In the context of high\u0000dimensional predictive regression, the desparsified LASSO faces an additional\u0000challenge: the Stambaugh bias arising from nonstationary regressors. To restore\u0000the standard inferential procedure, we propose a novel estimator called\u0000IVX-desparsified LASSO (XDlasso). XDlasso eliminates the shrinkage bias and the\u0000Stambaugh bias simultaneously and does not require prior knowledge about the\u0000identities of nonstationary and stationary regressors. We establish the\u0000asymptotic properties of XDlasso for hypothesis testing, and our theoretical\u0000findings are supported by Monte Carlo simulations. Applying our method to\u0000real-world applications from the FRED-MD database -- which includes a rich set\u0000of control variables -- we investigate two important empirical questions: (i)\u0000the predictability of the U.S. stock returns based on the earnings-price ratio,\u0000and (ii) the predictability of the U.S. inflation using the unemployment rate.","PeriodicalId":501293,"journal":{"name":"arXiv - ECON - Econometrics","volume":"26 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142261057","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Estimating Wage Disparities Using Foundation Models","authors":"Keyon Vafa, Susan Athey, David M. Blei","doi":"arxiv-2409.09894","DOIUrl":"https://doi.org/arxiv-2409.09894","url":null,"abstract":"One thread of empirical work in social science focuses on decomposing group\u0000differences in outcomes into unexplained components and components explained by\u0000observable factors. In this paper, we study gender wage decompositions, which\u0000require estimating the portion of the gender wage gap explained by career\u0000histories of workers. Classical methods for decomposing the wage gap employ\u0000simple predictive models of wages which condition on a small set of simple\u0000summaries of labor history. The problem is that these predictive models cannot\u0000take advantage of the full complexity of a worker's history, and the resulting\u0000decompositions thus suffer from omitted variable bias (OVB), where covariates\u0000that are correlated with both gender and wages are not included in the model.\u0000Here we explore an alternative methodology for wage gap decomposition that\u0000employs powerful foundation models, such as large language models, as the\u0000predictive engine. Foundation models excel at making accurate predictions from\u0000complex, high-dimensional inputs. We use a custom-built foundation model,\u0000designed to predict wages from full labor histories, to decompose the gender\u0000wage gap. We prove that the way such models are usually trained might still\u0000lead to OVB, but develop fine-tuning algorithms that empirically mitigate this\u0000issue. Our model captures a richer representation of career history than simple\u0000models and predicts wages more accurately. In detail, we first provide a novel\u0000set of conditions under which an estimator of the wage gap based on a\u0000fine-tuned foundation model is $sqrt{n}$-consistent. Building on the theory,\u0000we then propose methods for fine-tuning foundation models that minimize OVB.\u0000Using data from the Panel Study of Income Dynamics, we find that history\u0000explains more of the gender wage gap than standard econometric models can\u0000measure, and we identify elements of history that are important for reducing\u0000OVB.","PeriodicalId":501293,"journal":{"name":"arXiv - ECON - Econometrics","volume":"49 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142261053","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Structural counterfactual analysis in macroeconomics: theory and inference","authors":"Endong Wang","doi":"arxiv-2409.09577","DOIUrl":"https://doi.org/arxiv-2409.09577","url":null,"abstract":"We propose a structural model-free methodology to analyze two types of\u0000macroeconomic counterfactuals related to policy path deviation: hypothetical\u0000trajectory and policy intervention. Our model-free approach is built on a\u0000structural vector moving-average (SVMA) model that relies solely on the\u0000identification of policy shocks, thereby eliminating the need to specify an\u0000entire structural model. Analytical solutions are derived for the\u0000counterfactual parameters, and statistical inference for these parameter\u0000estimates is provided using the Delta method. By utilizing external\u0000instruments, we introduce a projection-based method for the identification,\u0000estimation, and inference of these parameters. This approach connects our\u0000counterfactual analysis with the Local Projection literature. A\u0000simulation-based approach with nonlinear model is provided to add in addressing\u0000Lucas' critique. The innovative model-free methodology is applied in three\u0000counterfactual studies on the U.S. monetary policy: (1) a historical scenario\u0000analysis for a hypothetical interest rate path in the post-pandemic era, (2) a\u0000future scenario analysis under either hawkish or dovish interest rate policy,\u0000and (3) an evaluation of the policy intervention effect of an oil price shock\u0000by zeroing out the systematic responses of the interest rate.","PeriodicalId":501293,"journal":{"name":"arXiv - ECON - Econometrics","volume":"26 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142261050","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Unconditional Randomization Tests for Interference","authors":"Liang Zhong","doi":"arxiv-2409.09243","DOIUrl":"https://doi.org/arxiv-2409.09243","url":null,"abstract":"In social networks or spatial experiments, one unit's outcome often depends\u0000on another's treatment, a phenomenon called interference. Researchers are\u0000interested in not only the presence and magnitude of interference but also its\u0000pattern based on factors like distance, neighboring units, and connection\u0000strength. However, the non-random nature of these factors and complex\u0000correlations across units pose challenges for inference. This paper introduces\u0000the partial null randomization tests (PNRT) framework to address these issues.\u0000The proposed method is finite-sample valid and applicable with minimal network\u0000structure assumptions, utilizing randomization testing and pairwise\u0000comparisons. Unlike existing conditional randomization tests, PNRT avoids the\u0000need for conditioning events, making it more straightforward to implement.\u0000Simulations demonstrate the method's desirable power properties and its\u0000applicability to general interference scenarios.","PeriodicalId":501293,"journal":{"name":"arXiv - ECON - Econometrics","volume":"36 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142261051","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Clustered Dose-Response Function Estimator for continuous treatment with heterogeneous treatment effects","authors":"Cerqua Augusto, Di Stefano Roberta, Mattera Raffaele","doi":"arxiv-2409.08773","DOIUrl":"https://doi.org/arxiv-2409.08773","url":null,"abstract":"Many treatments are non-randomly assigned, continuous in nature, and exhibit\u0000heterogeneous effects even at identical treatment intensities. Taken together,\u0000these characteristics pose significant challenges for identifying causal\u0000effects, as no existing estimator can provide an unbiased estimate of the\u0000average causal dose-response function. To address this gap, we introduce the\u0000Clustered Dose-Response Function (Cl-DRF), a novel estimator designed to\u0000discern the continuous causal relationships between treatment intensity and the\u0000dependent variable across different subgroups. This approach leverages both\u0000theoretical and data-driven sources of heterogeneity and operates under relaxed\u0000versions of the conditional independence and positivity assumptions, which are\u0000required to be met only within each identified subgroup. To demonstrate the\u0000capabilities of the Cl-DRF estimator, we present both simulation evidence and\u0000an empirical application examining the impact of European Cohesion funds on\u0000economic growth.","PeriodicalId":501293,"journal":{"name":"arXiv - ECON - Econometrics","volume":"85 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142261054","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Trends and biases in the social cost of carbon","authors":"Richard S. J. Tol","doi":"arxiv-2409.08158","DOIUrl":"https://doi.org/arxiv-2409.08158","url":null,"abstract":"An updated and extended meta-analysis confirms that the central estimate of\u0000the social cost of carbon is around $200/tC with a large, right-skewed\u0000uncertainty and trending up. The pure rate of time preference and the inverse\u0000of the elasticity of intertemporal substitution are key assumptions, the total\u0000impact of 2.5K warming less so. The social cost of carbon is much higher if\u0000climate change is assumed to affect economic growth rather than the level of\u0000output and welfare. The literature is dominated by a relatively small network\u0000of authors, based in a few countries. Publication and citation bias have pushed\u0000the social cost of carbon up.","PeriodicalId":501293,"journal":{"name":"arXiv - ECON - Econometrics","volume":"1566 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142184087","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}