{"title":"使密度预测模型在统计上一致","authors":"Michael Carney, P. Cunningham, B. Lucey","doi":"10.2139/ssrn.877629","DOIUrl":null,"url":null,"abstract":"We propose a new approach to density forecast optimisation and apply it to Value-at-Risk estimation. All existing density forecasting models try to optimise the distribution of the returns based solely on the predicted density at the observation. In this paper we argue that probabilistic predictions should be optimised on more than just this accuracy score and suggest that the statistical consistency of the probability estimates should also be optimised during training. Statistical consistency refers to the property that if a predicted density function suggests P percent probability of occurrence, the event truly ought to have probability P of occurring. We describe a quality score that can rank probability density forecasts in terms of statistical consistency based on the probability integral transform (Diebold et al., 1998b). We then describe a framework that can optimise any density forecasting model in terms of any set of objective functions. The framework uses a multi-objective evolutionary algorithm to determine a set of trade-off solutions known as the Pareto front of optimal solutions. Using this framework we develop an algorithm for optimising density forecasting models and implement this algorithm for GARCH (Bollerslev, 1986) and GJR models (Glosten et al., 1993). We call these new models Pareto-GARCH and Pareto-GJR. To determine whether this approach of multi-objective optimisation of density forecasting models produces better results over the standard GARCH and GJR optimisation techniques we compare the models produced empirically on a Value-at-Risk application. Our evaluation shows that our Pareto models produce superior results out-of-sample.","PeriodicalId":149679,"journal":{"name":"Frontiers in Finance & Economics","volume":"34 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2006-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"Making Density Forecasting Models Statistically Consistent\",\"authors\":\"Michael Carney, P. Cunningham, B. Lucey\",\"doi\":\"10.2139/ssrn.877629\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We propose a new approach to density forecast optimisation and apply it to Value-at-Risk estimation. All existing density forecasting models try to optimise the distribution of the returns based solely on the predicted density at the observation. In this paper we argue that probabilistic predictions should be optimised on more than just this accuracy score and suggest that the statistical consistency of the probability estimates should also be optimised during training. Statistical consistency refers to the property that if a predicted density function suggests P percent probability of occurrence, the event truly ought to have probability P of occurring. We describe a quality score that can rank probability density forecasts in terms of statistical consistency based on the probability integral transform (Diebold et al., 1998b). We then describe a framework that can optimise any density forecasting model in terms of any set of objective functions. The framework uses a multi-objective evolutionary algorithm to determine a set of trade-off solutions known as the Pareto front of optimal solutions. 
Using this framework we develop an algorithm for optimising density forecasting models and implement this algorithm for GARCH (Bollerslev, 1986) and GJR models (Glosten et al., 1993). We call these new models Pareto-GARCH and Pareto-GJR. To determine whether this approach of multi-objective optimisation of density forecasting models produces better results over the standard GARCH and GJR optimisation techniques we compare the models produced empirically on a Value-at-Risk application. Our evaluation shows that our Pareto models produce superior results out-of-sample.\",\"PeriodicalId\":149679,\"journal\":{\"name\":\"Frontiers in Finance & Economics\",\"volume\":\"34 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2006-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Frontiers in Finance & Economics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.2139/ssrn.877629\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in Finance & Economics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2139/ssrn.877629","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9
Abstract
We propose a new approach to density forecast optimisation and apply it to Value-at-Risk estimation. All existing density forecasting models optimise the return distribution based solely on the predicted density at each observation. In this paper we argue that probabilistic predictions should be optimised on more than this accuracy score alone, and suggest that the statistical consistency of the probability estimates should also be optimised during training. Statistical consistency refers to the property that if a predicted density function assigns an event a probability of P, the event should indeed occur with probability P. We describe a quality score, based on the probability integral transform (Diebold et al., 1998b), that ranks probability density forecasts in terms of statistical consistency. We then describe a framework that can optimise any density forecasting model with respect to any set of objective functions. The framework uses a multi-objective evolutionary algorithm to determine a set of trade-off solutions known as the Pareto front of optimal solutions. Using this framework we develop an algorithm for optimising density forecasting models and implement it for the GARCH (Bollerslev, 1986) and GJR (Glosten et al., 1993) models. We call these new models Pareto-GARCH and Pareto-GJR. To determine whether this multi-objective optimisation of density forecasting models produces better results than the standard GARCH and GJR optimisation techniques, we compare the resulting models empirically on a Value-at-Risk application. Our evaluation shows that the Pareto models produce superior results out-of-sample.
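As an illustrative sketch (ours, not the authors' code): statistical consistency can be measured through the probability integral transform. If each one-step-ahead forecast CDF F_t is correct, the PIT values z_t = F_t(y_t) are i.i.d. Uniform(0, 1), so a distance between the empirical distribution of the PITs and the uniform distribution (here a Kolmogorov-Smirnov statistic, one plausible stand-in for the paper's quality score) gives a calibration objective to minimise.

```python
import numpy as np
from scipy import stats

def pit_values(y, cdf_forecasts):
    # Probability integral transform: z_t = F_t(y_t), where F_t is the
    # one-step-ahead forecast CDF issued for observation y_t.
    return np.array([F(y_t) for y_t, F in zip(y, cdf_forecasts)])

def consistency_score(z):
    # Kolmogorov-Smirnov distance of the PITs from Uniform(0, 1);
    # 0 means perfectly calibrated, larger means less statistically
    # consistent.
    statistic, _ = stats.kstest(z, "uniform")
    return statistic
```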
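The GARCH(1,1) recursion supplies the forecast densities being scored. Below is a hedged sketch of the two objectives evaluated for a single parameter vector, assuming Gaussian innovations and an unconditional-variance initialisation (both our assumptions, not details fixed by the abstract).

```python
def garch11_variance(y, omega, alpha, beta):
    # GARCH(1,1) conditional variance recursion (Bollerslev, 1986):
    # sigma2_t = omega + alpha * y_{t-1}**2 + beta * sigma2_{t-1}.
    sigma2 = np.empty(len(y))
    sigma2[0] = np.var(y)  # a common initialisation choice
    for t in range(1, len(y)):
        sigma2[t] = omega + alpha * y[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

def objectives(params, y):
    # Score one parameter vector on both objectives (minimise each):
    # density accuracy as negative log-likelihood, and the PIT-based
    # consistency score defined above.
    omega, alpha, beta = params
    sigma = np.sqrt(garch11_variance(y, omega, alpha, beta))
    nll = -stats.norm.logpdf(y, scale=sigma).sum()
    z = stats.norm.cdf(y, scale=sigma)  # PITs under the Gaussian forecasts
    return nll, consistency_score(z)
```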
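A full multi-objective evolutionary algorithm also involves variation and selection operators; the core idea it rests on is Pareto dominance, sketched below as a simple stand-in. A Value-at-Risk forecast then falls out of whichever model is chosen off the front, as the p-quantile of its predicted density (sigma_t times the inverse normal CDF at p in the Gaussian case).

```python
def pareto_front(candidate_params, y):
    # Keep the non-dominated candidates: a candidate is dominated if
    # another scores no worse on both objectives and strictly better
    # on at least one. The survivors form the Pareto front from which
    # a Pareto-GARCH model would be selected.
    scores = [objectives(p, y) for p in candidate_params]
    front = []
    for i, s_i in enumerate(scores):
        dominated = any(
            all(a <= b for a, b in zip(s_j, s_i)) and s_j != s_i
            for j, s_j in enumerate(scores)
            if j != i
        )
        if not dominated:
            front.append(candidate_params[i])
    return front
```

For example, given a series of returns, pareto_front([(0.01, 0.05, 0.90), (0.02, 0.10, 0.85)], returns) would keep whichever (omega, alpha, beta) parameterisations trade off likelihood against calibration without being beaten on both objectives at once.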