Making Density Forecasting Models Statistically Consistent

Michael Carney, P. Cunningham, B. Lucey
{"title":"使密度预测模型在统计上一致","authors":"Michael Carney, P. Cunningham, B. Lucey","doi":"10.2139/ssrn.877629","DOIUrl":null,"url":null,"abstract":"We propose a new approach to density forecast optimisation and apply it to Value-at-Risk estimation. All existing density forecasting models try to optimise the distribution of the returns based solely on the predicted density at the observation. In this paper we argue that probabilistic predictions should be optimised on more than just this accuracy score and suggest that the statistical consistency of the probability estimates should also be optimised during training. Statistical consistency refers to the property that if a predicted density function suggests P percent probability of occurrence, the event truly ought to have probability P of occurring. We describe a quality score that can rank probability density forecasts in terms of statistical consistency based on the probability integral transform (Diebold et al., 1998b). We then describe a framework that can optimise any density forecasting model in terms of any set of objective functions. The framework uses a multi-objective evolutionary algorithm to determine a set of trade-off solutions known as the Pareto front of optimal solutions. Using this framework we develop an algorithm for optimising density forecasting models and implement this algorithm for GARCH (Bollerslev, 1986) and GJR models (Glosten et al., 1993). We call these new models Pareto-GARCH and Pareto-GJR. To determine whether this approach of multi-objective optimisation of density forecasting models produces better results over the standard GARCH and GJR optimisation techniques we compare the models produced empirically on a Value-at-Risk application. Our evaluation shows that our Pareto models produce superior results out-of-sample.","PeriodicalId":149679,"journal":{"name":"Frontiers in Finance & Economics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2006-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"Making Density Forecasting Models Statistically Consistent\",\"authors\":\"Michael Carney, P. Cunningham, B. Lucey\",\"doi\":\"10.2139/ssrn.877629\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We propose a new approach to density forecast optimisation and apply it to Value-at-Risk estimation. All existing density forecasting models try to optimise the distribution of the returns based solely on the predicted density at the observation. In this paper we argue that probabilistic predictions should be optimised on more than just this accuracy score and suggest that the statistical consistency of the probability estimates should also be optimised during training. Statistical consistency refers to the property that if a predicted density function suggests P percent probability of occurrence, the event truly ought to have probability P of occurring. We describe a quality score that can rank probability density forecasts in terms of statistical consistency based on the probability integral transform (Diebold et al., 1998b). We then describe a framework that can optimise any density forecasting model in terms of any set of objective functions. The framework uses a multi-objective evolutionary algorithm to determine a set of trade-off solutions known as the Pareto front of optimal solutions. 
Using this framework we develop an algorithm for optimising density forecasting models and implement this algorithm for GARCH (Bollerslev, 1986) and GJR models (Glosten et al., 1993). We call these new models Pareto-GARCH and Pareto-GJR. To determine whether this approach of multi-objective optimisation of density forecasting models produces better results over the standard GARCH and GJR optimisation techniques we compare the models produced empirically on a Value-at-Risk application. Our evaluation shows that our Pareto models produce superior results out-of-sample.\",\"PeriodicalId\":149679,\"journal\":{\"name\":\"Frontiers in Finance & Economics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2006-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Frontiers in Finance & Economics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.2139/ssrn.877629\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in Finance & Economics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2139/ssrn.877629","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9

Abstract

We propose a new approach to density forecast optimisation and apply it to Value-at-Risk estimation. All existing density forecasting models try to optimise the distribution of the returns based solely on the predicted density at the observation. In this paper we argue that probabilistic predictions should be optimised on more than just this accuracy score, and we suggest that the statistical consistency of the probability estimates should also be optimised during training. Statistical consistency refers to the property that if a predicted density function assigns probability P to an event, the event should truly occur with probability P. We describe a quality score, based on the probability integral transform (Diebold et al., 1998b), that can rank probability density forecasts in terms of statistical consistency. We then describe a framework that can optimise any density forecasting model with respect to any set of objective functions. The framework uses a multi-objective evolutionary algorithm to determine a set of trade-off solutions known as the Pareto front of optimal solutions. Using this framework we develop an algorithm for optimising density forecasting models and implement it for GARCH (Bollerslev, 1986) and GJR (Glosten et al., 1993) models. We call these new models Pareto-GARCH and Pareto-GJR. To determine whether this multi-objective optimisation of density forecasting models produces better results than the standard GARCH and GJR optimisation techniques, we compare the resulting models empirically on a Value-at-Risk application. Our evaluation shows that our Pareto models produce superior results out-of-sample.
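The consistency criterion rests on the probability integral transform (PIT) of Diebold et al. (1998b): if the forecast densities are correct, evaluating each one-step-ahead forecast CDF at its realised observation yields i.i.d. Uniform(0, 1) values. The paper's exact quality score is not reproduced in the abstract, so the sketch below uses the Kolmogorov-Smirnov distance from uniformity as an illustrative stand-in; the function names (`pit_values`, `consistency_score`) are our own.

```python
import numpy as np
from scipy import stats

def pit_values(observations, forecast_cdfs):
    """Probability integral transform: evaluate each one-step-ahead
    forecast CDF at the observation it was issued for. Under a
    statistically consistent forecaster these values are i.i.d.
    Uniform(0, 1) (Diebold et al., 1998b)."""
    return np.array([cdf(y) for y, cdf in zip(observations, forecast_cdfs)])

def consistency_score(pit):
    """Illustrative stand-in for the paper's quality score: the
    Kolmogorov-Smirnov distance between the empirical PIT distribution
    and Uniform(0, 1). Lower is better; 0 means perfectly uniform."""
    return stats.kstest(pit, "uniform").statistic

# Example: Gaussian forecasts N(0, sigma_t^2) for three returns.
sigmas = np.array([0.010, 0.012, 0.009])
returns = np.array([0.004, -0.020, 0.001])
cdfs = [lambda y, s=s: stats.norm.cdf(y, scale=s) for s in sigmas]
print(consistency_score(pit_values(returns, cdfs)))
```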
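The optimisation framework then trades this consistency score off against the usual accuracy objective (for instance negative log-likelihood), both minimised. The abstract names a multi-objective evolutionary algorithm but not which one, so the selection, crossover and mutation machinery is omitted here; the core notions of Pareto dominance and of extracting the non-dominated front can be sketched directly:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b under minimisation:
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Extract the non-dominated subset of a population of
    (model_params, objective_vector) pairs."""
    return [
        (params, obj)
        for params, obj in population
        if not any(dominates(other, obj) for _, other in population)
    ]

# Hypothetical population scored on (neg. log-likelihood, PIT score):
population = [("theta1", (120.5, 0.08)),
              ("theta2", (118.2, 0.15)),
              ("theta3", (125.0, 0.20))]
print(pareto_front(population))  # theta3 is dominated by theta1
```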
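Finally, the candidate models being optimised are conditional-variance models, and the Value-at-Risk evaluation reads quantiles off the forecast density. Below is a minimal sketch of the GARCH(1,1) recursion (Bollerslev, 1986) and a one-step-ahead VaR, assuming Gaussian innovations and a sample-variance initialisation; the abstract does not commit to these choices, and the GJR variant (Glosten et al., 1993) adds an asymmetry term to the same recursion.

```python
import numpy as np
from scipy import stats

def garch11_variances(returns, omega, alpha, beta):
    """GARCH(1,1) conditional-variance recursion:
        sigma2[t+1] = omega + alpha * r[t]**2 + beta * sigma2[t]
    (GJR adds gamma * r[t]**2 * (r[t] < 0) for the leverage effect).
    Returns len(returns) + 1 variances; the last is the one-step forecast."""
    sigma2 = np.empty(len(returns) + 1)
    sigma2[0] = np.var(returns)  # a common, but not the only, initialisation
    for t, r in enumerate(returns):
        sigma2[t + 1] = omega + alpha * r**2 + beta * sigma2[t]
    return sigma2

def value_at_risk(sigma2_next, level=0.01):
    """Level-p quantile of the forecast density N(0, sigma2_next): the
    (negative) return below which losses fall with probability `level`."""
    return stats.norm.ppf(level, scale=np.sqrt(sigma2_next))

returns = np.array([0.004, -0.020, 0.001, 0.012, -0.007])
sigma2 = garch11_variances(returns, omega=1e-6, alpha=0.05, beta=0.90)
print(value_at_risk(sigma2[-1], level=0.01))  # 1% one-step VaR
```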