Economic Capital Analysis with Portfolios of Dependent and Heavy-Tailed Risks

Date

2020-05-11

Authors

Kye, Yisub

Abstract

In today's climate of prudent risk management, the problem of determining aggregate risk capital in financial entities has been studied intensively for decades. As a result, canonical methods have been developed and even embedded in regulatory accords. While applauded by some and questioned by others, these methods provide a much-desired standard benchmark for everyone. The situation is very different when the aggregate risk capital needs to be allocated to the business units of a financial entity. There are overwhelmingly many ways to conduct the allocation exercise, and arguably no standard method is on the horizon.

Two overarching approaches to allocating the aggregate risk capital stand out: the top-down approach, in which the allocation exercise is imposed by the corporate centre, and the bottom-up approach, in which the allocation of the aggregate risk to business units is informed by the units themselves. Briefly, top-down allocations start with the aggregate risk capital, which is then apportioned among business units according to the views of the centre, thus limiting the inputs from the business units. The bottom-up approach does start with the business units, but it is, as a rule, too granular, and so may miss the wood for the trees.

The first chapter of this dissertation is concerned with the bottom-up approach to allocating the aggregate risk capital. Namely, we put forward a general theoretical framework for the multiplicative background risk model that allows for arbitrarily distributed idiosyncratic and systemic risk factors. We reveal links between this general structure and the one with exponentially distributed idiosyncratic risk factors (a key player in modern actuarial modelling), study relevant theoretical properties of the new structure, and discuss important special cases. We also construct realistic numerical examples borrowed from the context of the determination and allocation of economic capital. The examples suggest that even a small departure from exponentiality can have a substantial impact on the outcome of risk analysis.
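The flavour of such a numerical experiment can be conveyed with a minimal Monte Carlo sketch. This is not the dissertation's model specification; the distributions and parameters below (a Pareto systemic factor, exponential versus equal-mean gamma idiosyncratic factors, a 99% value-at-risk capital rule) are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Multiplicative background risk model: X_i = R_i * Y, where Y is a
# systemic risk factor common to all business units and the R_i are
# idiosyncratic factors.  Heavy-tailed systemic factor with mean 1.5:
Y = rng.pareto(3.0, size=n) + 1.0

# Classical choice: exponential idiosyncratic factors with mean 1.
R_exp = rng.exponential(1.0, size=(n, 2))
# A small departure from exponentiality: gamma factors with the same mean.
R_gam = rng.gamma(0.8, 1.25, size=(n, 2))

# Aggregate risks of a two-unit entity under each specification.
S_exp = (R_exp * Y[:, None]).sum(axis=1)
S_gam = (R_gam * Y[:, None]).sum(axis=1)

# Aggregate economic capital set via the 99% value-at-risk.
var_exp = np.quantile(S_exp, 0.99)
var_gam = np.quantile(S_gam, 0.99)
print(var_exp, var_gam)
```

Both specifications share the same unit-level means, so any gap between the two capital figures is driven purely by the shape of the idiosyncratic factors.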

In the second chapter of this dissertation, we question the way risk allocation is conducted in current practice and present an alternative that comes from the context of distributions defined on the multidimensional simplex. More specifically, we put forward a new family of mixed-scaled Dirichlet distributions that contains the classical Dirichlet distribution as a special case, exhibits a multitude of desirable closure properties, and emerges naturally within the multivariate risk analysis context. As a by-product, our construction revisits the proportional allocation rule that is often used in applications. Interestingly, we are able to unify the top-down and bottom-up approaches to allocating the aggregate risk capital into one encompassing method.
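For readers unfamiliar with the proportional allocation rule mentioned above, a simple sketch may help. It shows only the classical rule, not the dissertation's Dirichlet-based unification; the simulated risks, the common-shock dependence, and the 99% value-at-risk are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Two dependent unit-level risks sharing a common exponential shock.
Y = rng.exponential(1.0, size=n)
X = rng.gamma([2.0, 3.0], 1.0, size=(n, 2)) * Y[:, None]
S = X.sum(axis=1)

# Aggregate economic capital: the top-down input.
K = np.quantile(S, 0.99)

# Proportional allocation rule: distribute K in proportion to the units'
# stand-alone risk measures (here their own 99% VaRs, the bottom-up input).
stand_alone = np.quantile(X, 0.99, axis=0)
allocation = K * stand_alone / stand_alone.sum()
print(allocation)
```

By construction the allocated amounts sum back to the aggregate capital K, which is the full-allocation property that makes the rule popular in applications.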

During the study underlying the present dissertation, we rediscovered certain problems with the standard deviation as the ubiquitous measure of variability. In particular, the standard deviation is frequently infinite for insurance risks in the Property and Casualty lines of business, and so it cannot be used to quantify variability there. The standard deviation is also a questionable measure of variability for non-normal distributions, and normality is rarely a reasonable assumption in insurance practice. Therefore, in the third chapter of this dissertation, we turn to an alternative measure of variability. The Gini Mean Difference, which we study in that chapter, is finite whenever the mean is finite, and it is suitable for measuring the variability of non-normal risks. Nevertheless, the Gini Mean Difference is far less common in actuarial science than the standard deviation, largely because of criticisms concerning its computability. We reveal convenient ways to compute the Gini Mean Difference measure of variability explicitly and often effortlessly. The thrust of our approach is a link, which we discover, between the Gini and the notion of size-biased sampling. Not only does this link open up advantageous computational routes for the Gini, but it also yields an alternative interpretation of it.
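The Gini Mean Difference is defined as GMD(X) = E|X1 - X2| for independent copies X1, X2 of X, which makes it finite whenever E|X| is. The sketch below does not reproduce the dissertation's size-bias route; it only checks the definitional Monte Carlo estimate against the classical order-statistics formula, on an exponential risk with mean 1, whose GMD equals 1 exactly.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# Definitional Monte Carlo estimate: average of |X1 - X2| over iid pairs.
x1 = rng.exponential(1.0, size=n)
x2 = rng.exponential(1.0, size=n)
gmd_mc = np.abs(x1 - x2).mean()

def gmd(sample):
    """O(n log n) Gini Mean Difference via order statistics:
    GMD = 2 / (n(n-1)) * sum_i (2i - n - 1) * x_(i)."""
    x = np.sort(np.asarray(sample, dtype=float))
    m = len(x)
    i = np.arange(1, m + 1)
    return 2.0 / (m * (m - 1)) * np.sum((2 * i - m - 1) * x)

print(gmd_mc, gmd(x1))
```

Both estimates should land near the exact value 1; the order-statistics version avoids the naive O(n^2) pairwise computation that the computability criticisms target.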

Keywords

Economics
