{"title":"Evaluating the Predictive Performance of Composites in PLS Path Modeling","authors":"N. Danks, Soumya Ray, G. Shmueli","doi":"10.2139/ssrn.3055222","DOIUrl":"https://doi.org/10.2139/ssrn.3055222","url":null,"abstract":"Efforts to evaluate predictive performance in Partial Least Squares (PLS) path modeling are making major headway, but have largely focused on the prediction of measurement items. There is still a need to clarify what prediction of constructs might entail. We examine the challenges of measuring predictive power and validity at the construct level. We then propose a technique for overcoming these challenges and provide suitable predictive metrics.","PeriodicalId":106740,"journal":{"name":"ERN: Other Econometrics: Econometric Model Construction","volume":"100 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131949488","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Inference for Impulse Responses Under Model Uncertainty","authors":"L. Lieb, Stephan Smeekes","doi":"10.2139/ssrn.2940104","DOIUrl":"https://doi.org/10.2139/ssrn.2940104","url":null,"abstract":"In many macroeconomic applications, confidence intervals for impulse responses are constructed by estimating VAR models in levels - ignoring cointegration rank uncertainty. We investigate the consequences of ignoring this uncertainty. We adapt several methods for handling model uncertainty and highlight their shortcomings. We propose a new method - Weighted-Inference-by-Model-Plausibility (WIMP) - that takes rank uncertainty into account in a data-driven way. In simulations the WIMP outperforms all other methods considered, delivering intervals that are robust to rank uncertainty, yet not overly conservative. We also study potential ramifications of rank uncertainty on applied macroeconomic analysis by re-assessing the effects of fiscal policy shocks.","PeriodicalId":106740,"journal":{"name":"ERN: Other Econometrics: Econometric Model Construction","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121924535","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Circumscribing System Dynamics Modeling and Building Confidence in Models a Personal Perspective","authors":"K. Saeed","doi":"10.2139/ssrn.3093080","DOIUrl":"https://doi.org/10.2139/ssrn.3093080","url":null,"abstract":"While there is a consensus among system dynamics scholars that the dichotomous term validity must be replaced by the term confidence for system dynamics models, it is unclear what qualifies as a system dynamics model – a computational instrument for forecasting, or an experimental tool to inform the policy process? And what exactly needs to be done to build confidence in a model? Confidence building process is described in the system dynamics writings at a rather philosophical level that can be used to justify almost any model. The confidence building procedures provided in the text books are sketchy, do not distinguish between forecasting and policy models and do not adequately describe the iterative process subsumed in the various steps of model construction that might yield confidence. Confidence in forecasting models is an article of faith no matter how detailed they might be and how diligent is their calibration. Forecasting models are albeit irrelevant to system dynamics practice, which must focus on policy. This paper revisits the problem of confidence in system dynamics models addressing policy and attempts to carefully describe their qualification and the process that practitioners must follow to arrive at them.","PeriodicalId":106740,"journal":{"name":"ERN: Other Econometrics: Econometric Model Construction","volume":"110 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-05-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121798880","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Financial Globalisation, Monetary Policy Spillovers and Macro-Modelling: Tales from 1001 Shocks","authors":"Georgios P. Georgiadis, Martina Jančoková","doi":"10.24149/gwp314","DOIUrl":"https://doi.org/10.24149/gwp314","url":null,"abstract":"Financial globalisation and spillovers have gained immense prominence over the last two decades. Yet, powerful cross-border financial spillover channels have not become a standard element of structural monetary models. Against this background, we hypothesise that New Keynesian DSGE models that do not feature powerful financial spillover channels confound the effects of domestic and foreign disturbances when confronted with the data. We derive predictions from this hypothesis and subject them to data on monetary policy shock estimates for 29 economies obtained from more than 280 monetary models in the literature. Consistent with the predictions from our hypothesis we find: Monetary policy shock estimates obtained from New Keynesian DSGE models that do not account for powerful financial spillover channels are contaminated by a common global component; the contamination is more severe for economies that are more susceptible to financial spillovers in the data; and the shock estimates imply implausibly similar estimates of the global output spillovers from monetary policy in the US and the euro area. None of these findings applies to monetary policy shock estimates obtained from VAR and other statistical models, financial market expectations and the narrative approach.","PeriodicalId":106740,"journal":{"name":"ERN: Other Econometrics: Econometric Model Construction","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121504704","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Research on the Motion Picture Industry: State of the Art and New Directions Off the Beaten Track Away from Theatrical Release","authors":"Florian Kumb, Reinhard E. Kunz, Gabriele Siegert","doi":"10.1111/joes.12145","DOIUrl":"https://doi.org/10.1111/joes.12145","url":null,"abstract":"The motion picture industry has been subject of extensive academic research over the last decades. However, most scholars focused on the U.S. theatrical motion picture market. The number of research activities regarding even more profitable release windows, such as home video or television, has been substantially lower. Although international distribution is essential for a motion picture project to break even, there has been little significant re-search on most other markets. This paper aims at summarizing the current state of research on the motion picture industry, particularly from marketing and management perspective, revealing research gaps, and proposing recommendations for future research endeavors. Therefore, a three-pillar scheme is developed to systemize previous findings: Research on intraorganizational decision making, contractual relationships between national stakeholders, and international market competition are differentiated. Since these insights are mainly derived from U.S. theatrical exhibition, they can hardly be applied to other markets and exhibition windows. Thus, potential research areas are identified to expand knowledge of posttheatrical and international markets.","PeriodicalId":106740,"journal":{"name":"ERN: Other Econometrics: Econometric Model Construction","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125378709","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Minimum Contrast Empirical Likelihood Manipulation Testing for Regression Discontinuity Design","authors":"Jun Ma, Hugo Jales, Zhengfei Yu","doi":"10.2139/ssrn.2925682","DOIUrl":"https://doi.org/10.2139/ssrn.2925682","url":null,"abstract":"This paper proposes a simple empirical-likelihood-based inference method for discontinuity in density. In a regression discontinuity design (RDD), the continuity of the density of the assignment variable at the threshold is considered as a “nomanipulation” behavioral assumption, which is a testable implication of an identifying condition for the local treatment effect (LATE). Our approach is based on the first-order conditions obtained from a minimum contrast (MC) problem and complements Otsu et al. (2013)’s method. Our inference procedure has three main advantages. Firstly, it requires only one tuning parameter; secondly, it does not require concentrating out any nuisance parameter and therefore is very easily implementable; thirdly, its delicate second-order properties lead to a simple coverage-error-optimal (CE-optimal) bandwidth selection rule. We propose a data-driven CE-optimal bandwidth selector for use in practice. Results from Monte Carlo simulations are presented. Usefulness of our method is illustrated by empirical examples.","PeriodicalId":106740,"journal":{"name":"ERN: Other Econometrics: Econometric Model Construction","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132647683","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Standardized Method for the Evaluation of Adherence to Practice Guidelines","authors":"Stephanie Thomas","doi":"10.2139/ssrn.2924711","DOIUrl":"https://doi.org/10.2139/ssrn.2924711","url":null,"abstract":"Practice guidelines are widely used in medical settings as a means of improving efficiency and quality of care by aligning service provision with evidence of what is effective. The objective of this work is to propose a methodology for the effective evaluation of the match of clinical practice data with a practice guideline. The proposed methodology uses a combination of existing analytical techniques which minimize the need for the analyst to specify a functional form for the process generating the clinical data. The methodology is illustrated in an application to a set of field data on the supplemental oxygen administration decisions of volunteer medical first responders. The result is a methodology for evaluation of guideline adherence which leverages existing patient care records and is generalizable across clinical contexts. In addition, the results are visually intuitive, supporting communication across diverse audiences.","PeriodicalId":106740,"journal":{"name":"ERN: Other Econometrics: Econometric Model Construction","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114485528","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Inter-Quantile Ranges and Volatility of Financial Data","authors":"T. Dimpfl, D. Baur","doi":"10.2139/ssrn.2835951","DOIUrl":"https://doi.org/10.2139/ssrn.2835951","url":null,"abstract":"We propose to estimate the variance of a time series of financial returns through a quantile autoregressive model (QAR) and demonstrate that the return QAR model contains all information that is commonly captured in two separate equations for the mean and variance of a GARCH-type model. In particular, QAR allows to characterize the entire distribution of returns conditional on a positive or negative return of any given size. We show theoretically and in an empirical application that the inter-quantile range spanned by conditional quantile estimates identifies the asymmetric response of volatility to lagged returns, resulting in broader conditional densities for negative returns than for positive returns. Finally, we estimate the conditional variance based on the estimated conditional density and illustrate its accuracy with an evaluation of Value-at-Risk and variance forecasts.","PeriodicalId":106740,"journal":{"name":"ERN: Other Econometrics: Econometric Model Construction","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-09-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133064143","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Testing a Single Regression Coefficient in High Dimensional Regression Model","authors":"Wei Lan, Pingshou Zhong, Runze Li, Hansheng Wang, Chih-Ling Tsai","doi":"10.2139/ssrn.2783153","DOIUrl":"https://doi.org/10.2139/ssrn.2783153","url":null,"abstract":"In linear regression models with high dimensional data, the classical z-test (or t-test) for testing the significance of each single regression coefficient is no longer applicable. This is mainly because the number of covariates exceeds the sample size. In this paper, we propose a simple and novel alternative by introducing the Correlated Predictors Screening (CPS) method to control for predictors that are highly correlated with the target covariate. Accordingly, the classical ordinary least squares approach can be employed to estimate the regression coefficient associated with the target covariate. In addition, we demonstrate that the resulting estimator is consistent and asymptotically normal even if the random errors are heteroscedastic. This enables us to apply the z-test to assess the significance of each covariate. Based on the p-value obtained from testing the significance of each covariate, we further conduct multiple hypothesis testing by controlling the false discovery rate at the nominal level. Then, we show that the multiple hypothesis testing achieves consistent model selection. Simulation studies and empirical examples are presented to illustrate the finite sample performance and the usefulness of the proposed method, respectively.","PeriodicalId":106740,"journal":{"name":"ERN: Other Econometrics: Econometric Model Construction","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126391243","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Consistent Re-Calibration in Yield Curve Modeling: An Example","authors":"Mario V. Wuthrich","doi":"10.2139/ssrn.2630164","DOIUrl":"https://doi.org/10.2139/ssrn.2630164","url":null,"abstract":"Popular yield curve models include affine term structure models. These models are usually based on a fixed set of parameters which is calibrated to the actual financial market conditions. Under changing market conditions also parametrization changes. We discuss how parameters need to be updated with changing market conditions such that the re-calibration meets the premise of being free of arbitrage. We demonstrate this (consistent) re-calibration with the Hull-White extended discrete time Vasicek model at hand, but this concept applies to a wide range of related term structure models.","PeriodicalId":106740,"journal":{"name":"ERN: Other Econometrics: Econometric Model Construction","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-07-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124698553","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}