{"title":"极值理论中阈值估计的规范检验","authors":"L. C. Miranda","doi":"10.21314/JOP.2014.145","DOIUrl":null,"url":null,"abstract":"A fundamental component in the modeling of a financial risk exposure is the estimation of the probability distribution function that best describes the true data-generation process of independent and extreme loss events that fall above a certain threshold. In this paper, we assume that, above the threshold, the extreme loss events are explained by an extreme value distribution. For that purpose, we apply the classical peaks-over-threshold method in extreme-value statistics. According to that approach, data in excess of a certain threshold is asymptotically described by a generalized Pareto distribution (GPD). Consequently, establishing a mechanism to estimate this threshold is of major importance. The current methods to estimate the thresholds are based on a subjective inspection of mean excess plots or other statistical measures; the Hill estimator, for example, leads to an undesirable level of subjectivity. In this paper, we propose an innovative mechanism that increases the level of objectivity of threshold selection, departing from a subjective and imprecise eyeballing of charts. The proposed algorithm is based on the properties of the generalized Pareto distribution and considers the choice of threshold to be an important modeling decision that can have significant impact on the model outcomes. The algorithm we introduce here is based on the Hausman specification test to determine the threshold, which maintains proper specification so that the other parameters of the distribution can be estimated without compromising the balance between bias and variance. We apply the test to real risk data so that we can obtain a practical example of the improvements the process will bring. Results show that the Hausman test is a valid mechanism for estimating the GPD threshold and can be seen as a relevant enhancement in the objectivity of the entire process.","PeriodicalId":54030,"journal":{"name":"Journal of Operational Risk","volume":"35 1","pages":"23-37"},"PeriodicalIF":0.4000,"publicationDate":"2014-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"Specification Test for Threshold Estimation in Extreme Value Theory\",\"authors\":\"L. C. Miranda\",\"doi\":\"10.21314/JOP.2014.145\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A fundamental component in the modeling of a financial risk exposure is the estimation of the probability distribution function that best describes the true data-generation process of independent and extreme loss events that fall above a certain threshold. In this paper, we assume that, above the threshold, the extreme loss events are explained by an extreme value distribution. For that purpose, we apply the classical peaks-over-threshold method in extreme-value statistics. According to that approach, data in excess of a certain threshold is asymptotically described by a generalized Pareto distribution (GPD). Consequently, establishing a mechanism to estimate this threshold is of major importance. The current methods to estimate the thresholds are based on a subjective inspection of mean excess plots or other statistical measures; the Hill estimator, for example, leads to an undesirable level of subjectivity. 
In this paper, we propose an innovative mechanism that increases the level of objectivity of threshold selection, departing from a subjective and imprecise eyeballing of charts. The proposed algorithm is based on the properties of the generalized Pareto distribution and considers the choice of threshold to be an important modeling decision that can have significant impact on the model outcomes. The algorithm we introduce here is based on the Hausman specification test to determine the threshold, which maintains proper specification so that the other parameters of the distribution can be estimated without compromising the balance between bias and variance. We apply the test to real risk data so that we can obtain a practical example of the improvements the process will bring. Results show that the Hausman test is a valid mechanism for estimating the GPD threshold and can be seen as a relevant enhancement in the objectivity of the entire process.\",\"PeriodicalId\":54030,\"journal\":{\"name\":\"Journal of Operational Risk\",\"volume\":\"35 1\",\"pages\":\"23-37\"},\"PeriodicalIF\":0.4000,\"publicationDate\":\"2014-06-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Operational Risk\",\"FirstCategoryId\":\"96\",\"ListUrlMain\":\"https://doi.org/10.21314/JOP.2014.145\",\"RegionNum\":4,\"RegionCategory\":\"经济学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"BUSINESS, FINANCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Operational Risk","FirstCategoryId":"96","ListUrlMain":"https://doi.org/10.21314/JOP.2014.145","RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"BUSINESS, FINANCE","Score":null,"Total":0}
Specification Test for Threshold Estimation in Extreme Value Theory
A fundamental component in modeling a financial risk exposure is estimating the probability distribution that best describes the true data-generating process of independent, extreme loss events falling above a certain threshold. In this paper we assume that, above the threshold, extreme loss events are described by an extreme value distribution, and we apply the classical peaks-over-threshold method of extreme value statistics. Under that approach, data in excess of a sufficiently high threshold are asymptotically described by a generalized Pareto distribution (GPD), so establishing a mechanism to estimate this threshold is of major importance. Current methods for estimating the threshold rely on subjective inspection of mean excess plots or other statistical measures such as the Hill estimator, which leads to an undesirable level of subjectivity. In this paper, we propose a mechanism that increases the objectivity of threshold selection, departing from the subjective and imprecise eyeballing of charts. The proposed algorithm is based on the properties of the generalized Pareto distribution and treats the choice of threshold as an important modeling decision that can have a significant impact on the model outcomes. The algorithm determines the threshold using the Hausman specification test, keeping the model properly specified so that the remaining parameters of the distribution can be estimated without compromising the balance between bias and variance. We apply the test to real risk data to give a practical example of the improvements the procedure brings. Results show that the Hausman test is a valid mechanism for estimating the GPD threshold and can be seen as a meaningful enhancement to the objectivity of the entire process.
Journal introduction:
In December 2017, the Basel Committee published the final version of its standardized measurement approach (SMA) methodology, which will replace the approaches set out in Basel II (i.e., the simpler standardized approaches and the advanced measurement approach (AMA), which allowed the use of internal models) from January 1, 2022. Independently of the Basel III rules, risks still need to be measured before they can be managed and mitigated, and the operational risk industry needs to keep that in mind. While the purpose of the now defunct AMA was to determine the level of regulatory capital needed to protect a firm against operational risks, we still can, and should, use models to estimate operational risk economic capital. Without these models, the task of managing and mitigating capital would be incredibly difficult. These internal models are now unshackled from regulatory requirements and can be optimized for managing the daily risks to which financial institutions are exposed. In addition, operational risk models can and should be used for stress tests and the Comprehensive Capital Analysis and Review (CCAR).

The Journal of Operational Risk also welcomes papers on nonfinancial risks as well as topics including, but not limited to, the following:

- The modeling and management of operational risk.
- Recent advances in techniques used to model operational risk, e.g., copulas, correlation, aggregate loss distributions, Bayesian methods and extreme value theory.
- The pricing and hedging of operational risk and/or any risk transfer techniques.
- Data modeling of external loss data, business control factors and scenario analysis.
- Models used to aggregate different types of data.
- Causal models that link key risk indicators and macroeconomic factors to operational losses.
- Regulatory issues, such as Basel II or any other local regulatory issue.
- Enterprise risk management.
- Cyber risk.
- Big data.