{"title":"Stochastic Revealed Preferences with Measurement Error","authors":"Victor H. Aguiar, N. Kashaev","doi":"10.1093/restud/rdaa067","DOIUrl":"https://doi.org/10.1093/restud/rdaa067","url":null,"abstract":"A long-standing question about consumer behavior is whether individuals' observed purchase decisions satisfy the revealed preference (RP) axioms of the utility maximization theory (UMT). Researchers using survey or experimental panel data sets on prices and consumption to answer this question face the well-known problem of measurement error. We show that ignoring measurement error in the RP approach may lead to overrejection of the UMT. To solve this problem, we propose a new statistical RP framework for consumption panel data sets that allows for testing the UMT in the presence of measurement error. Our test is applicable to all consumer models that can be characterized by their first-order conditions. Our approach is nonparametric, allows for unrestricted heterogeneity in preferences, and requires only a centering condition on measurement error. We develop two applications that provide new evidence about the UMT. First, we find support in a survey data set for the dynamic and time-consistent UMT in single-individual households, in the presence of emph{nonclassical} measurement error in consumption. In the second application, we cannot reject the static UMT in a widely used experimental data set in which measurement error in prices is assumed to be the result of price misperception due to the experimental design. The first finding stands in contrast to the conclusions drawn from the deterministic RP test of Browning (1989). The second finding reverses the conclusions drawn from the deterministic RP test of Afriat (1967) and Varian (1982).","PeriodicalId":8448,"journal":{"name":"arXiv: Econometrics","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2018-10-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87677311","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Valid simultaneous inference in high-dimensional settings (with the HDM package for R)","authors":"Philipp Bach, V. Chernozhukov, M. Spindler","doi":"10.1920/WP.CEM.2019.3019","DOIUrl":"https://doi.org/10.1920/WP.CEM.2019.3019","url":null,"abstract":"Due to the increasing availability of high-dimensional empirical applications in many research disciplines, valid simultaneous inference becomes more and more important. For instance, high-dimensional settings might arise in economic studies due to very rich data sets with many potential covariates or in the analysis of treatment heterogeneities. Also the evaluation of potentially more complicated (non-linear) functional forms of the regression relationship leads to many potential variables for which simultaneous inferential statements might be of interest. Here we provide a review of classical and modern methods for simultaneous inference in (high-dimensional) settings and illustrate their use by a case study using the R package hdm. The R package hdm implements valid joint powerful and efficient hypothesis tests for a potentially large number of coeffcients as well as the construction of simultaneous confidence intervals and, therefore, provides useful methods to perform valid post-selection inference based on the LASSO.","PeriodicalId":8448,"journal":{"name":"arXiv: Econometrics","volume":"58 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2018-09-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90340102","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Moment Inequalities in the Context of Simulated and Predicted Variables","authors":"Hiroaki Kaido, Jiaxuan Li, Marc Rysman","doi":"10.1920/WPM.CEM.2018.2618","DOIUrl":"https://doi.org/10.1920/WPM.CEM.2018.2618","url":null,"abstract":"This paper explores the effects of simulated moments on the performance of inference methods based on moment inequalities. Commonly used confi dence sets for parameters are level sets of criterion functions whose boundary points may depend on sample moments in an irregular manner. Due to this feature, simulation errors can affect the performance of inference in non-standard ways. In particular, a (fi rst-order) bias due to the simulation errors may remain in the estimated boundary of the con fidence set. We demonstrate, through Monte Carlo experiments, that simulation errors can signi ficantly reduce the coverage probabilities of confi dence sets in small samples. The size distortion is particularly severe when the number of inequality restrictions is large. These results highlight the danger of ignoring the sampling variations due to the simulation errors in moment inequality models. Similar issues arise when using predicted variables in moment inequalities models. We propose a method for properly correcting for these variations based on regularizing the intersection of moments in parameter space, and we show that our proposed method performs well theoretically and in practice.","PeriodicalId":8448,"journal":{"name":"arXiv: Econometrics","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2018-04-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90893035","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Simultaneous Mean-Variance Regression","authors":"R. Spady, S. Stouli","doi":"10.1920/WP.CEM.2018.2518","DOIUrl":"https://doi.org/10.1920/WP.CEM.2018.2518","url":null,"abstract":"We propose simultaneous mean-variance regression for the linear estimation and approximation of conditional mean functions. In the presence of heteroskedasticity of unknown form, our method accounts for varying dispersion in the regression outcome across the support of conditioning variables by using weights that are jointly determined with the mean regression parameters. Simultaneity generates outcome predictions that are guaranteed to improve over ordinary least-squares prediction error, with corresponding parameter standard errors that are automatically valid. Under shape misspecification of the conditional mean and variance functions, we establish existence and uniqueness of the resulting approximations and characterize their formal interpretation and robustness properties. In particular, we show that the corresponding mean-variance regression location-scale model weakly dominates the ordinary least-squares location model under a Kullback-Leibler measure of divergence, with strict improvement in the presence of heteroskedasticity. The simultaneous mean-variance regression loss function is globally convex and the corresponding estimator is easy to implement. We establish its consistency and asymptotic normality under misspecification, provide robust inference methods, and present numerical simulations that show large improvements over ordinary and weighted least-squares in terms of estimation and inference in finite samples. We further illustrate our method with two empirical applications to the estimation of the relationship between economic prosperity in 1500 and today, and demand for gasoline in the United States.","PeriodicalId":8448,"journal":{"name":"arXiv: Econometrics","volume":"43 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2018-04-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"79361028","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Role of Agricultural Sector Productivity in Economic Growth: The Case of Iran's Economic Development Plan","authors":"M. Tahamipour, Mina Mahmoudi","doi":"10.5296/RAE.V10I1.12809","DOIUrl":"https://doi.org/10.5296/RAE.V10I1.12809","url":null,"abstract":"This study provides the theoretical framework and empirical model for productivity growth evaluations in agricultural sector as one of the most important sectors in Iran's economic development plan. We use the Solow residual model to measure the productivity growth share in the value-added growth of the agricultural sector. Our time series data includes value-added per worker, employment, and capital in this sector. The results show that the average total factor productivity growth rate in the agricultural sector is -0.72% during 1991-2010. Also, during this period, the share of total factor productivity growth in the value-added growth is -19.6%, while it has been forecasted to be 33.8% in the fourth development plan. Considering the effective role of capital in the agricultural low productivity, we suggest applying productivity management plans (especially in regards of capital productivity) to achieve future growth goals.","PeriodicalId":8448,"journal":{"name":"arXiv: Econometrics","volume":"13 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2018-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"86630531","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Permutation tests for equality of distributions of functional data","authors":"Federico A. Bugni, J. Horowitz","doi":"10.1920/WP.CEM.2018.1818","DOIUrl":"https://doi.org/10.1920/WP.CEM.2018.1818","url":null,"abstract":"Economic data are often generated by stochastic processes that take place in continuous time, though observations may occur only at discrete times. For example, electricity and gas consumption take place in continuous time. Data generated by a continuous time stochastic process are called functional data. This paper is concerned with comparing two or more stochastic processes that generate functional data. The data may be produced by a randomized experiment in which there are multiple treatments. The paper presents a test of the hypothesis that the same stochastic process generates all the functional data. In contrast to existing methods, the test described here applies to both functional data and multiple treatments. The test is presented as a permutation test, which ensures that in a finite sample, the true and nominal probabilities of rejecting a correct null hypothesis are equal. The paper also presents the asymptotic distribution of the test statistic under alternative hypotheses. The results of Monte Carlo experiments and an application to an experiment on billing and pricing of natural gas illustrate the usefulness of the test.","PeriodicalId":8448,"journal":{"name":"arXiv: Econometrics","volume":"1 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2018-03-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89440243","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Strategy of the remove and easy TBT in GCC6 countries","authors":"Yongjae Kim","doi":"10.5121/csit.2018.80404","DOIUrl":"https://doi.org/10.5121/csit.2018.80404","url":null,"abstract":"The last technical barriers to trade(TBT) between countries are Non-Tariff Barriers(NTBs), meaning all trade barriers are possible other than Tariff Barriers. And the most typical examples are (TBT), which refer to measure Technical Regulation, Standards, Procedure for Conformity Assessment, Test & Certification etc. Therefore, in order to eliminate TBT, WTO has made all membership countries automatically enter into an agreement on TBT. In this study, the elimination strategy of TBT with aid of technical regulations or standards is excluded, and only the conformity assessment shall be considered as the strategic measure of eliminating TBT in GCC(Gulf Cooperation Council) 6 countries. The measure for every membership country to accord with the international standards corresponding to their technical regulations and standards, is only to present TBT related Specific Trade Concern(STC) to WTO. However, each of countries retains its own conformity assessment area, and measures to settle such differences are various as well. Therefore, it is likely required an appropriate level of harmonization in them to carry forward this scheme. KTC(Korea Testing Certification) written MRA with GCC test & certification company in 2015 years. So Korea exporting company can export to GCC goods with attached test & certification documents in Korea. To conclude, it is suggest MRA for the remove and reduce TBT to increse export and import among countries.","PeriodicalId":8448,"journal":{"name":"arXiv: Econometrics","volume":"39 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2018-02-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90484772","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Predicting crypto-currencies using sparse non-Gaussian state space models","authors":"Christian Hotz-Behofsits, Florian Huber, Thomas O. Zorner","doi":"10.1002/FOR.2524","DOIUrl":"https://doi.org/10.1002/FOR.2524","url":null,"abstract":"In this paper we forecast daily returns of crypto-currencies using a wide variety of different econometric models. To capture salient features commonly observed in financial time series like rapid changes in the conditional variance, non-normality of the measurement errors and sharply increasing trends, we develop a time-varying parameter VAR with t-distributed measurement errors and stochastic volatility. To control for overparameterization, we rely on the Bayesian literature on shrinkage priors that enables us to shrink coefficients associated with irrelevant predictors and/or perform model specification in a flexible manner. Using around one year of daily data we perform a real-time forecasting exercise and investigate whether any of the proposed models is able to outperform the naive random walk benchmark. To assess the economic relevance of the forecasting gains produced by the proposed models we moreover run a simple trading exercise.","PeriodicalId":8448,"journal":{"name":"arXiv: Econometrics","volume":"8 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2018-01-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87891642","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A comment on 'Testing Goodwin: growth cycles in ten OECD countries'","authors":"M. Grasselli, Aditya Maheshwari","doi":"10.1093/cje/bex018","DOIUrl":"https://doi.org/10.1093/cje/bex018","url":null,"abstract":"We revisit the results of Harvie (2000) and show how correcting for a reporting mistake in some of the estimated parameter values leads to significantly different conclusions, including realistic parameter values for the Philips curve and estimated equilibrium employment rates exhibiting on average one tenth of the relative error of those obtained in Harvie (2000).","PeriodicalId":8448,"journal":{"name":"arXiv: Econometrics","volume":"54 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2017-11-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89038124","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Non-asymptotic inference in instrumental variables estimation","authors":"J. Horowitz","doi":"10.1920/WP.CEM.2017.4617","DOIUrl":"https://doi.org/10.1920/WP.CEM.2017.4617","url":null,"abstract":"This paper presents a simple method for carrying out inference in a wide variety of possibly nonlinear IV models under weak assumptions. The method is non-asymptotic in the sense that it provides a finite sample bound on the difference between the true and nominal probabilities of rejecting a correct null hypothesis. The method is a non-Studentized version of the Anderson-Rubin test but is motivated and analyzed differently. In contrast to the conventional Anderson-Rubin test, the method proposed here does not require restrictive distributional assumptions, linearity of the estimated model, or simultaneous equations. Nor does it require knowledge of whether the instruments are strong or weak. It does not require testing or estimating the strength of the instruments. The method can be applied to quantile IV models that may be nonlinear and can be used to test a parametric IV model against a nonparametric alternative. The results presented here hold in finite samples, regardless of the strength of the instruments.","PeriodicalId":8448,"journal":{"name":"arXiv: Econometrics","volume":"51 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2017-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"90269031","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}