{"title":"Forecasting Using Robust Exponential Smoothing with Damped Trend and Seasonal Components","authors":"Ruben Crevits, C. Croux","doi":"10.2139/ssrn.3068634","DOIUrl":"https://doi.org/10.2139/ssrn.3068634","url":null,"abstract":"We provide a framework for robust exponential smoothing. For a class of exponential smoothing variants, we present a robust alternative. The class includes models with a damped trend and/or seasonal components. We provide robust forecasting equations, robust starting values, robust smoothing parameter estimation and a robust information criterion. The method is implemented in the R package robets, allowing for automatic forecasting. We compare the standard non-robust version with the robust alternative in a simulation study. Finally, the methodology is tested on data.","PeriodicalId":170198,"journal":{"name":"ERN: Forecasting Techniques (Topic)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125213705","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Cyclical Dispersion in Expected Defaults","authors":"João F. Gomes, M. Grotteria, Jessica A. Wachter","doi":"10.2139/ssrn.2949447","DOIUrl":"https://doi.org/10.2139/ssrn.2949447","url":null,"abstract":"A growing literature shows that credit indicators forecast aggregate real outcomes. While researchers have proposed various explanations, the economic mechanism behind these results remains an open question. In this paper, we show that a simple, frictionless model explains empirical findings commonly attributed to credit cycles. Our key assumption is that firms have heterogeneous exposures to underlying economy-wide shocks. This leads to endogenous dispersion in credit quality that varies over time and predicts future excess returns and real outcomes.Received August 7, 2017; editorial decision June 26, 2018 by Editor Stijn Van Nieuwerburgh. Authors have furnished an Internet Appendix, which is available on the Oxford University Press Web site next to the link to the final published paper online.","PeriodicalId":170198,"journal":{"name":"ERN: Forecasting Techniques (Topic)","volume":"94 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130968861","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Accounting for Volatility Decay in Time Series Models for Leveraged Exchange Traded Funds","authors":"A. Abdou","doi":"10.2139/ssrn.2980208","DOIUrl":"https://doi.org/10.2139/ssrn.2980208","url":null,"abstract":"Leverage Exchange Traded Funds (LETF's) returns tend to deviate from their underlying assets' multiple returns as their holding period increase, a phenomenon known as volatility decay. Algebraically, it is shown that volatility decay is intensified for inverse leveraged funds and as the leverage multiplier increases. The paper uses a novel approach to account for volatility decay. The ARIMA model ability to forecast future returns is tested for three major indexes and is shown to provide more accurate estimates for S&P500. The returns of S&P500 and its corresponding LETFs are fitted to an Autoregressive Integrated Moving Average (ARIMA) model. Theoretically, the constant of the ARIMA model and the variance of their Gaussian errors captures the volatility decay effect. Generalized Autoregressive Conditional Heteroskedasticity (GARCH) models provide more flexibility in modeling conditional variance that is non-stationary. The theoretical results are verified empirically, and the constant of the fitted model captures the intensity of the decay and its direction.","PeriodicalId":170198,"journal":{"name":"ERN: Forecasting Techniques (Topic)","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127531462","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Bayesian Compressed Vector Autoregressions","authors":"G. Koop, Dimitris Korobilis, Davide Pettenuzzo","doi":"10.2139/ssrn.2754241","DOIUrl":"https://doi.org/10.2139/ssrn.2754241","url":null,"abstract":"Macroeconomists are increasingly working with large Vector Autoregressions (VARs) where the number of parameters vastly exceeds the number of observations. Existing approaches either involve prior shrinkage or the use of factor methods. In this paper, we develop an alternative based on ideas from the compressed regression literature. It involves randomly compressing the explanatory variables prior to analysis. A huge dimensional problem is thus turned into a much smaller, more computationally tractable one. Bayesian model averaging can be done over various compressions, attaching greater weight to compressions which forecast well. In a macroeconomic application involving up to 129 variables, we find compressed VAR methods to forecast better than either factor methods or large VAR methods involving prior shrinkage.","PeriodicalId":170198,"journal":{"name":"ERN: Forecasting Techniques (Topic)","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129360084","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Surveying Professional Forecasters","authors":"Y. Grushka-Cockayne, K. C. Lichtendahl","doi":"10.2139/ssrn.2975170","DOIUrl":"https://doi.org/10.2139/ssrn.2975170","url":null,"abstract":"This case illustrates how averaging point forecasts harnesses the wisdom of crowds. Students access data from the Survey of Professional Forecasters (SPF) and compare the performance of the crowd (i.e., the average point forecasts) to the average performance of the individual panelists and the best performer from the previous period.The case is intended for use in a class on forecasting, and the instructor can present it in three ways: with all necessary SPF data cleaned and preprocessed in a student spreadsheet (UVA-QA-0805X, provided with the case); with code (also provided in the student spreadsheet) written by the case authors in R, the statistical computing package, as well as a supplementary handout (UVA-QA-0805H, also provided with the case), which walks students through R code, explaining how to clean and analyze the SPF data; or as a team project to be worked on over several days, providing neither the spreadsheet nor the supplement. \u0000Excerpt \u0000UVA-QA-0805 \u0000Rev. Apr. 7, 2014 \u0000SURVEYING PROFESSIONAL FORECASTERS \u0000Since 1981, the Wall Street Journal surveyed economists for point forecasts of economic indicators such as gross domestic product (GDP), inflation, and unemployment. These forecasts were scored and ranked annually based on their accuracy. The top-ranked forecasters were celebrated in a special announcement of the complete ranking on the Journal's website, typically followed by a press release by the top forecasters' employers. \u0000In 2012, the Journal decided for the first time to average all the forecasts for each indicator and present that set of forecasts as an additional panelist. The economists' average forecast, or “the crowd,” was then ranked alongside the actual panelists. \u0000How had the economists' average forecast performed? How did the crowd compare to “chasing the expert”? Among the 49 panelists, it ranked 12th in 2012. Yet Don Leavens and Tim Gill, the top-ranked team in 2011, came in 5th in 2012 (see Exhibits 1 and 2). What did this say about the wisdom of the crowd? \u0000. . .","PeriodicalId":170198,"journal":{"name":"ERN: Forecasting Techniques (Topic)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-06-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125271246","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"How to Predict Financial Stress? An Assessment of Markov Switching Models","authors":"Thibaut Duprey, Benjamin Klaus","doi":"10.2866/773816","DOIUrl":"https://doi.org/10.2866/773816","url":null,"abstract":"This paper predicts phases of the financial cycle by using a continuous financial stress measure in a Markov switching framework. The debt service ratio and property market variables signal a transition to a high financial stress regime, while economic sentiment indicators provide signals for a transition to a tranquil state. Whereas the in-sample analysis suggests that these indicators can provide an early warning signal up to several quarters prior to the respective regime change, the out-of-sample findings indicate that most of this performance is owing to the data gathered during the global financial crisis. Comparing the prediction performance with a standard binary early warning model reveals that the Markov switching model is outperforming the vast majority of model specifications for a horizon up to three quarters prior to the onset of financial stress.","PeriodicalId":170198,"journal":{"name":"ERN: Forecasting Techniques (Topic)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-05-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121020282","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Extremizing and Anti-Extremizing in Bayesian Ensembles of Binary-Event Forecasts","authors":"K. C. Lichtendahl, Y. Grushka-Cockayne, V. R. Jose, R. L. Winkler","doi":"10.2139/ssrn.2940740","DOIUrl":"https://doi.org/10.2139/ssrn.2940740","url":null,"abstract":"Many organizations combine forecasts of probabilities of binary events to support critical business decisions, such as the approval of credit or the recommendation of a drug. To aggregate individual probabilities, we offer a new method based on Bayesian principles that can help identify why and when combined probabilities need to be extremized. Extremizing is typically viewed as shifting the average probability farther from one half; we emphasize that it is more suitable to define extremizing as shifting it farther from the base rate. We introduce the notion of antiextremizing, cases in which it might be beneficial to make average probabilities less extreme. Analytically, we find that our Bayesian ensembles often extremize the average forecast but sometimes antiextremize instead. On several publicly available data sets, we demonstrate that our Bayesian ensemble performs well and antiextremizes anywhere from 18% to 73% of the cases. Antiextremizing is required more often when there is bracketing with respect to the base rate among the probabilities being aggregated than with no bracketing.","PeriodicalId":170198,"journal":{"name":"ERN: Forecasting Techniques (Topic)","volume":"14 2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116646714","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The French Nuclear Bet","authors":"Quentin Perrier","doi":"10.2139/ssrn.2947585","DOIUrl":"https://doi.org/10.2139/ssrn.2947585","url":null,"abstract":"Following the first oil crisis, France launched the world’s largest ever nuclear energy program, commissioning 58 new reactors. These reactors are now reaching 40 years of age, the end of their technological lifetime. This places France at an energy policy crossroads: should the reactors be retrofitted or should they be decommissioned? The cost-optimal decision depends on several factors going forward, in particular the expected costs of nuclear energy production, electricity demand levels and carbon prices, all of which are subject to significant uncertainty. To deal with these uncertainties, we apply the Robust Decision Making framework to determine which reactors should be retrofitted. We build an investment and dispatch optimization model, calibrated for France. Then we use it to study 27 retrofit strategies for all combinations of uncertain parameters, which amounts to nearly 3,000 runs. Our analysis produces two robust strategies, which involve shutting down between 7 and 14 of the 14 oldest reactors, while extending the lifetime of all remaining reactors. These strategies provide a hedge against the risks of unexpected increases in retrofit costs, low demand and low carbon price. Our robust strategies differ from the official French government scenarios on the timing and number of reactors suggested to be decommissioned. They provide a timely contribution to the current debate on the extension of lifetime of nuclear plants in France.","PeriodicalId":170198,"journal":{"name":"ERN: Forecasting Techniques (Topic)","volume":"55 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-03-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122531873","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The 2016 Pollyvote Popular Vote Forecast: A Preliminary Analysis","authors":"A. Graefe, J. Armstrong, Randall J. Jones, Alfred G. Cuzán","doi":"10.2139/SSRN.2884855","DOIUrl":"https://doi.org/10.2139/SSRN.2884855","url":null,"abstract":"We assess how the PollyVote and its components performed in this election compared to the previous six (1992 to 2012). While always predicting that Hillary Clinton would win the popular vote, across the 100 days leading to the election on average the PollyVote overshot the mark by 1.9 percentage points, almost twice the MAE incurred in the previous six elections. This was because this year there was very little bracketing among the components. Citizen forecasts and econometric models performed best this year, while the Iowa Electronic Markets came in last. Across all elections from 1992 to 2016, the PollyVote error is only a little over one percentage point.","PeriodicalId":170198,"journal":{"name":"ERN: Forecasting Techniques (Topic)","volume":"59 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-12-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127947793","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Forecast Errors and Uncertainty Shocks","authors":"Pratiti Chatterjee, Sylwia Nowak","doi":"10.5089/9781475555523.001.A001","DOIUrl":"https://doi.org/10.5089/9781475555523.001.A001","url":null,"abstract":"Macroeconomic forecasts are persistently too optimistic. This paper finds that common factors related to general uncertainty about U.S. macrofinancial prospects and global demand drive this overoptimism. These common factors matter most for advanced economies and G- 20 countries. The results suggest that an increase in uncertainty-driven overoptimism has dampening effects on next-year real GDP growth rates. This implies that incorporating the common structure governing forecast errors across countries can help improve subsequent forecasts.","PeriodicalId":170198,"journal":{"name":"ERN: Forecasting Techniques (Topic)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132557043","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}