Optimal B-Robust Posterior Distributions for Operational Risk
Ivan Luciano Danesi, Fabio Piacenza, E. Ruli and L. Ventura
Journal of Operational Risk, April 30, 2016. DOI: 10.21314/jop.2016.182

Abstract: One of the aims of operational risk modelling is to generate sound and reliable quantifications of the risk exposure, including a level of volatility that is consistent with changes in the risk profile. One way of assuring this is through robust procedures, such as optimal B-robust (OBR) estimating equations. In banking practice, more than one data set usually has to be incorporated into the risk model, and a coherent way to carry out such data integration is via Bayesian procedures. However, Bayesian inference via estimating equations is in general problematic, since the likelihood function is not available. We show that this issue can be dealt with using approximate Bayesian computation (ABC) methods, with the robust estimating function serving as a summary of the data. The method is illustrated on a real data set.
A Simulation Comparison of Quantile Approximation Techniques for Compound Distributions Popular in Operational Risk
Riaan de Jongh, T. de Wet, K. Panman and H. Raubenheimer
Journal of Operational Risk, March 21, 2016. DOI: 10.21314/JOP.2016.171

Abstract: Many banks currently use the loss distribution approach (LDA) to estimate economic and regulatory capital for operational risk under Basel's advanced measurement approach. The LDA requires, among other things, the modeling of the aggregate loss distribution in each operational risk category (ORC). The aggregate loss distribution is a compound distribution resulting from a random sum of losses, where the losses are distributed according to some severity distribution and the number of losses according to some frequency distribution. To estimate the economic or regulatory capital in a particular ORC, an extreme quantile of the aggregate loss distribution has to be estimated from the fitted severity and frequency distributions. Since no closed-form expression exists for the quantiles of the resulting compound distribution, the quantile is usually approximated by brute-force Monte Carlo simulation, which is computationally intensive. A number of numerical approximation techniques have been proposed to lessen this burden, including Panjer recursion, the fast Fourier transform and different orders of both the single-loss approximation and the perturbative approximation. The objective of this paper is to compare these methods in terms of their practical usefulness and potential applicability in an operational risk context. We find that the second-order perturbative approximation, a closed-form approximation, performs very well at the extreme quantiles and over a wide range of distributions, and is very easy to implement. This approximation can then be used as an input to the recursive fast Fourier algorithm to gain further improvements at the less extreme quantiles.
{"title":"Application of the Convolution Operator for Scenario Integration with Loss Data in Operational Risk Modeling","authors":"Pavan Aroda, A. Guergachi, Huaxiong Huang","doi":"10.21314/jop.2015.168","DOIUrl":"https://doi.org/10.21314/jop.2015.168","url":null,"abstract":"When using the advanced measurement approach to determine required regulatory capital for operational risk, expert opinion is applied via scenario analysis to help quantify exposure to high-severity events. A methodology is presented that makes use of the convolution operator to integrate scenarios into a baseline model. Using a baseline loss distribution model calibrated on historical losses and a scenario-derived loss distribution calibrated on scenario data points, the addition of both random processes equates to the convolution of the corresponding densities. Using an analogy from digital signal processing, the commutative property of convolution allows one function to smooth and average the other. The inherent uncertainty in scenario analysis has caused concern amongst practitioners when too much emphasis has been placed on absolutes in terms of quantified frequency/severity estimates. This method addresses this uncertainty and produces a combined loss distribution that takes information from the entire domain of the calibrated scenario distribution. The necessary theory is provided within and an example is shown to provide context.","PeriodicalId":54030,"journal":{"name":"Journal of Operational Risk","volume":"22 1","pages":""},"PeriodicalIF":0.5,"publicationDate":"2015-11-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"76952795","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mitigating Rogue-Trading Behavior by Means of Appropriate, Effective Operational Risk Management","authors":"S. Rick, Gerrit Jan van den Brink","doi":"10.21314/JOP.2015.162","DOIUrl":"https://doi.org/10.21314/JOP.2015.162","url":null,"abstract":"This paper discusses the violation of applicable firm guidelines by individuals employed by a bank or financial institution and suggests specific metrics to identify and prevent such behavior by means of appropriate, effective operational risk management. Since the actor is usually socially inconspicuous, and since the associated financial damage does not necessarily have to be verifiable through classic valuation methods (e.g. financial statements), we feel that it is very difficult for banks and financial institutions to uncover such behavior. Nevertheless, in order to be able to react to this latent risk, we apply modern, basic criminological assumptions to analyse the relationship between the multiple causes of the risk and their effects in the underlying risk origination process. The analysis is performed based on Schneider's model, which is used to describe the criminal behavior of socially inconspicuous individuals. Based on the result of that analysis, we design a specific conceptual risk indicator that approximates the underlying risk exposure by means of a linear function. We then operate the developed risk indicators through a dashboard, tracking the development of each valid indicator value through time. The effectiveness of the measures taken to counteract the risk can be derived from the development of the displayed indicator value and the related trend.","PeriodicalId":54030,"journal":{"name":"Journal of Operational Risk","volume":"53 ","pages":""},"PeriodicalIF":0.5,"publicationDate":"2015-09-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"72420150","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Bayesian Operational Risk Models","authors":"Silvia Figini, Lijun Gao, Paolo Giudici","doi":"10.21314/JOP.2015.155","DOIUrl":"https://doi.org/10.21314/JOP.2015.155","url":null,"abstract":"Operational risk is hard to quantify, for the presence of heavy tailed loss distributions. Extreme value distributions, used in this context, are very sensitive to the data, and this is a problem in the presence of rare loss data. Self risk assessment questionnaires, if properly modelled, may provide the missing piece of information that is necessary to adequately estimate op- erational risks. In this paper we propose to embody self risk assessment data into suitable prior distributions, and to follow a Bayesian approach to merge self assessment with loss data. We derive operational loss posterior distribu- tions, from which appropriate measures of risk, such as the Value at Risk, or the Expected Shortfall, can be derived. We test our proposed models on a real database, made up of internal loss data and self risk assessment questionnaires of an anonymous commercial bank. Our results show that the proposed Bayesian models performs better with respect to classical extreme value models, leading to a smaller quantification of the Value at Risk required to cover unexpected losses.","PeriodicalId":54030,"journal":{"name":"Journal of Operational Risk","volume":"38 1","pages":""},"PeriodicalIF":0.5,"publicationDate":"2015-06-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"87460323","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Shapley Allocation, Diversification and Services in Operational Risk","authors":"P. Mitic, Bertrand K. Hassani","doi":"10.21314/jop.2018.205","DOIUrl":"https://doi.org/10.21314/jop.2018.205","url":null,"abstract":"A method of allocating Operational Risk regulatory capital using the Shapley method for a large number of business units, supported by a service, is proposed. A closed-form formula for Shapley allocations is developed under two principal assumptions. First, if business units form coalitions, the value added to the coalition by a new entrant depends on a constant proportionality factor. This factor represents the diversification that can be achieved by combining operational risk losses. Second, that the service should reduce the capital payable by business units, and that this reduction is calculated as an integral part of the allocation process. We ensure that allocations of capital charges are acceptable to and are understandable by both risk and senior managers. The results derived are applied to recent loss data.","PeriodicalId":54030,"journal":{"name":"Journal of Operational Risk","volume":"129 1","pages":""},"PeriodicalIF":0.5,"publicationDate":"2015-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"73069919","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Improved Goodness-of-Fit Measures","authors":"P. Mitic","doi":"10.21314/JOP.2015.159","DOIUrl":"https://doi.org/10.21314/JOP.2015.159","url":null,"abstract":"New goodness-of-fit measures which are significant improvements on existing measures are described. They use the intuitive geometrical concept of the area enclosed by the curve of a fitted distribution and the profile of the empirical cumulative distribution function.A transformation of this profile simplifies the geometry and provides three new goodness-of-fit tests. The integrity of this transformation is justified by topological arguments. The new tests provide a quantitative justification for qualitative judgements on goodness-of-fit, are independent of population size and provide a workable way to objectively choose a best fit distribution from a group of candidate distributions.","PeriodicalId":54030,"journal":{"name":"Journal of Operational Risk","volume":"45 1","pages":""},"PeriodicalIF":0.5,"publicationDate":"2015-03-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"75518679","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Checklist-Based Weighted Fuzzy Severity Approach for Calculating Operational Risk Exposure on Foreign Exchange Trades Under the Basel II Regime","authors":"V. Sree Hari Rao, K. Ramesh","doi":"10.21314/jop.2014.136","DOIUrl":"https://doi.org/10.21314/jop.2014.136","url":null,"abstract":"It is well-known that any risk management activity is a cost to the organization. However, optimized risk management practices satisfy regulatory capital requirements and gain the confidence of investors who take calculated risks. A bank’s risk management division will generate a profit if it can develop methodologies to decrease the nonworking regulatory capital. This may be achieved only when the risks are measured using data from internal and external sources in conjunction with scenario analysis. One such method of measuring operational risk (OR) is the advanced measurement approach. This involves quantifying ORs across the various nodes within a bank following the loss distribution approach, in which the frequency and severity distributions of the loss-generating OR events are estimated from the data sources. These distributions are then used to generate the scenarios for frequency and its associated severity for estimating the OR capital. In our approach, the various levels of loss severity are mapped to a percentage of total trade exposure, and the occurrence frequency of an OR event is assumed to follow a binomial distribution.","PeriodicalId":54030,"journal":{"name":"Journal of Operational Risk","volume":"15 1","pages":""},"PeriodicalIF":0.5,"publicationDate":"2014-12-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"91019063","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Dissecting the JPMorgan Whale: A Post-Mortem","authors":"P. Mcconnell","doi":"10.21314/JOP.2014.144","DOIUrl":"https://doi.org/10.21314/JOP.2014.144","url":null,"abstract":"In many respects, the “London whale” scandal at JPMorgan Chase is similar to other “rogue trading” events, in that a group of traders took large, speculative positions in complex derivative securities that went wrong, resulting in over US$6 billion of trading losses to the firm. As in other rogue trading cases, there were desperate attempts to cover up the losses until they became too big to ignore and eventually had to be recognized in the financial accounts of the bank. However, the whale case, so-called because of the sheer size of the trading positions involved, differs in several important respects from other rogue trading cases, not least because the sheer size and riskiness of the positions were well-known to many executives within JPMorgan, a firm that prided itself on having advanced risk management capabilities and systems. The role of Model Risk in this scandal, while not the primary cause, is important in that at least part of the impetus to take huge positions was due to incorrect risk modeling. Various external and internal inquiries into the events have concluded that critical risk management processes in the bank broke down, not only in the Chief Investment Office, the division in which the losses occurred, but across the bank. In particular, deficiencies in the firm’s Model Development and Approval processes allowed traders to trade while underestimating the risks that they were running. Under Basel II regulations, losses due to process failure are classified as operational risk losses and hence this case demonstrates a significant failure of operational risk management in JPMorgan. This paper dissects the whale scandal from an operational risk perspective using the late Professor Barry Turner’s framework for analyzing organizational disasters. The paper also makes suggestions as to how model risk may be managed to prevent similar losses in future.","PeriodicalId":54030,"journal":{"name":"Journal of Operational Risk","volume":"61 1","pages":""},"PeriodicalIF":0.5,"publicationDate":"2014-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"74407194","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Specification Test for Threshold Estimation in Extreme Value Theory","authors":"L. C. Miranda","doi":"10.21314/JOP.2014.145","DOIUrl":"https://doi.org/10.21314/JOP.2014.145","url":null,"abstract":"A fundamental component in the modeling of a financial risk exposure is the estimation of the probability distribution function that best describes the true data-generation process of independent and extreme loss events that fall above a certain threshold. In this paper, we assume that, above the threshold, the extreme loss events are explained by an extreme value distribution. For that purpose, we apply the classical peaks-over-threshold method in extreme-value statistics. According to that approach, data in excess of a certain threshold is asymptotically described by a generalized Pareto distribution (GPD). Consequently, establishing a mechanism to estimate this threshold is of major importance. The current methods to estimate the thresholds are based on a subjective inspection of mean excess plots or other statistical measures; the Hill estimator, for example, leads to an undesirable level of subjectivity. In this paper, we propose an innovative mechanism that increases the level of objectivity of threshold selection, departing from a subjective and imprecise eyeballing of charts. The proposed algorithm is based on the properties of the generalized Pareto distribution and considers the choice of threshold to be an important modeling decision that can have significant impact on the model outcomes. The algorithm we introduce here is based on the Hausman specification test to determine the threshold, which maintains proper specification so that the other parameters of the distribution can be estimated without compromising the balance between bias and variance. We apply the test to real risk data so that we can obtain a practical example of the improvements the process will bring. Results show that the Hausman test is a valid mechanism for estimating the GPD threshold and can be seen as a relevant enhancement in the objectivity of the entire process.","PeriodicalId":54030,"journal":{"name":"Journal of Operational Risk","volume":"35 1","pages":"23-37"},"PeriodicalIF":0.5,"publicationDate":"2014-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89028079","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":4,"RegionCategory":"经济学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}