 

A Dependence Analysis Within the Context of Risk Allocations: Distributions on the Simplex and the Notion of Counter-Monotonicity


Date

2023-08-04

Authors

Mohammed, Nawaf Mahmood Abdullah

Journal Title

Journal ISSN

Volume Title

Publisher

Abstract

The remarkable development of today's financial and insurance products demands sound methodologies for the accumulation and characterization of intertwined risks. As a result, modern risk management rests on two key pillars. The first is concerned with the aggregation of said risks into a single random quantity, which can then be conveniently measured by a suitable risk measure and reported; the pooling is done across the different business units (BUs) composing the financial entity. The second pillar operates in the opposite direction and concerns the allocation of the total risk: it seeks to accurately and concretely attribute the riskiness of each individual BU with respect to the whole.

The aggregation process, on the one hand, has been fairly well studied in the literature, implemented in industry, and even embedded into the various regulatory accords. Risk capital allocation, on the other hand, is generally much more involved, even when a specific risk measure inducing the allocation rule is assumed, let alone when an entire class of risk measures is considered. And unlike the aggregation exercise, which is largely determined by the aggregation function, attributing capital is often more heavily influenced by the dependencies among the different BUs.

In the literature, nonetheless, approaches to allocating capital fall into two main camps. One is built upon the premise that the distribution of risk should satisfy certain regulatory requirements. This leads to an axiomatic approach which is quite often mathematically tractable, yet ignores the economic incentives of the market. The other school of thought is economically driven, allocating risk according to a profit-maximizing paradigm. It argues that capital allocation should reflect the risk perception of the institution rather than be imposed by an arbitrary measure whose selection is dubious at best. The economic approach, however, suffers from complex relations that lack clear, definitive forms.

At first glance the two perspectives may seem distant, as each arises naturally in its own context and is justified accordingly. Nonetheless, they can coincide for particular losses under certain peculiar model settings, which are described thoroughly in the chapters that follow. Surprisingly, the reconciliation comes in connection with the concept of trivial allocations. Triviality, in itself, attracts practitioners because it requires no discernible dependencies, leading to a convenient yet faulty method of attributing risk. Nevertheless, when used in the right context it unveils surprising connections and conveys useful conclusions. The intersection of the regulatory and profit-maximizing principles, for example, mainly utilizes a milder version of triviality (proportional), which allows for distinct, albeit few, probabilistic laws that accommodate both theories. Furthermore, when a stronger (absolute) triviality condition is imposed, it yields another intriguing corollary, namely the restrictive extreme laws commonly known for antithetic or counter-monotonic variates.
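As a minimal illustration of the classical counter-monotonic (antithetic) construction mentioned above, the following sketch pairs X = F⁻¹(U) with Y = G⁻¹(1 − U), so that Y is a decreasing function of X. The exponential margins are an illustrative choice, not one prescribed by the dissertation:

```python
import math
import random

random.seed(0)
n = 10_000

# Counter-monotonic pair: X = F^{-1}(U), Y = G^{-1}(1 - U).
# Standard exponential margins via the inverse-transform method.
u = [random.random() for _ in range(n)]
x = [-math.log(1 - ui) for ui in u]   # Exp(1), increasing in U
y = [-math.log(ui) for ui in u]       # Exp(1), driven by 1 - U: moves oppositely

def pearson(a, b):
    """Sample Pearson correlation, computed from scratch."""
    m = len(a)
    ma, mb = sum(a) / m, sum(b) / m
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a)
    vb = sum((bi - mb) ** 2 for bi in b)
    return cov / math.sqrt(va * vb)

print(pearson(x, y))  # strongly negative (about 1 - pi^2/6 for these margins)
```

This is the most extreme negative dependence attainable for the given margins; the dissertation's contribution is a more general notion of counter-monotonicity arising from trivial allocations.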

To address the framework hitherto introduced, in the first chapter of this dissertation we present a general class of weighted pricing functionals. This wide class covers most of the risk measures and allocations found in the literature today and adequately represents their various properties. We begin by investigating the ordering characteristics of the functionals under certain sufficient conditions. The results reveal interactive relationships between the weight and the aggregation make-up of the measures, which consequently allow for effective comparison between different risks. Then, upon imposing restrictions on the allocation constituents, we establish equivalent statements for trivial allocations that uncover a novel, general concept of counter-monotonicity. More significantly, similar equivalences are obtained for a weaker triviality notion, paving the way to answer the aforementioned question of allocation reconciliation.
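A common representative of such weighted functionals takes the form E[X w(S)] / E[w(S)] for a weight function w applied to the aggregate loss S; the sketch below is a Monte Carlo estimate of that form under assumed exponential unit losses and an Esscher-type weight, all illustrative choices rather than the dissertation's actual specification:

```python
import math
import random

random.seed(1)
n = 100_000

# Hypothetical business-unit losses (independent exponentials, for simplicity).
x1 = [random.expovariate(1.0) for _ in range(n)]
x2 = [random.expovariate(0.5) for _ in range(n)]
s = [a + b for a, b in zip(x1, x2)]   # aggregate loss S = X1 + X2

def weighted_allocation(x, s, w):
    """Monte Carlo estimate of the weighted functional E[X w(S)] / E[w(S)]."""
    num = sum(xi * w(si) for xi, si in zip(x, s))
    den = sum(w(si) for si in s)
    return num / den

w = lambda t: math.exp(0.1 * t)       # Esscher-type weight (illustrative)
a1 = weighted_allocation(x1, s, w)
a2 = weighted_allocation(x2, s, w)
a_total = weighted_allocation(s, s, w)
print(a1, a2, a_total)                # a1 + a2 equals the aggregate functional
```

Additivity of the unit allocations to the aggregate functional holds by linearity of the numerator, one of the structural properties that makes this class amenable to allocation theory.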

The class of weighted functionals, though constructive, is too general to apply effectively to the allocation theories. Thus, in the second chapter, we consider the special case of the conditional tail expectation (CTE), defining its risk measure and the allocation it induces. These represent the regulatory approach to allocation, as the CTE is arguably one of the most prominent measures used and studied today. On the other side, we consider the allocation arising in the economic context, which aims to maximize profit subject to market forces as well as individual perceptions. Both allocations are taken as proportions, since they are formed from compositional maps that relate to the standard simplex in either a stochastic or non-stochastic manner. We then equate the two allocations and derive a general description of the laws that satisfy both functionals. The Laplace transform of the multivariate size-biased distribution serves as the prime identifier delineating the general distributions and detailing subsequent corollaries and examples.
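The standard CTE risk measure and its induced allocation can be sketched empirically: CTE_α(S) = E[S | S > VaR_α(S)] and the unit allocation E[X_i | S > VaR_α(S)], with the proportional (compositional) version landing on the standard simplex. The i.i.d. exponential losses below are an assumed toy model:

```python
import random

random.seed(2)
n = 100_000
x1 = [random.expovariate(1.0) for _ in range(n)]
x2 = [random.expovariate(1.0) for _ in range(n)]
s = [a + b for a, b in zip(x1, x2)]

alpha = 0.95
var_level = sorted(s)[int(alpha * n)]           # empirical VaR_alpha(S)
tail = [i for i in range(n) if s[i] > var_level]

cte_s  = sum(s[i]  for i in tail) / len(tail)   # CTE risk measure E[S | S > VaR]
cte_x1 = sum(x1[i] for i in tail) / len(tail)   # CTE allocation to unit 1
cte_x2 = sum(x2[i] for i in tail) / len(tail)   # CTE allocation to unit 2

# As a composition, the proportional allocation lies on the standard simplex:
prop = (cte_x1 / cte_s, cte_x2 / cte_s)
print(cte_s, prop)
```

With exchangeable units, as here, the proportional allocation is close to (1/2, 1/2); the dissertation characterizes the laws for which such proportions also satisfy the economic, profit-maximizing functional.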

While studying the triviality nature of allocations, we focused on the central element of stochastic dependence. We showed how certain models, extremal dependence for instance, enormously influence the attribution outcome. Thus far, nonetheless, our inquiry started from allocation relations, be they proportional or absolute, and ended in characterizations of the laws that satisfy those relations. Equally important, on the other hand, is deriving allocation expressions from a priori assumed models. This task requires apt choices of general structures that convey the desired probabilistic nature of losses. Since constructing joint laws can be quite challenging, the compendium of probabilistic models relies heavily on leveraging the stochastic representations of known distributions. This allows not only for simpler computations but also for useful interpretations. Basic mathematical operations are usually deployed to derive joint distributions with certain desirable properties: taking the minimum yields the Marshall-Olkin distribution, addition gives the additive background model, and multiplication/division naturally leads to the multiplicative background model. Simultaneously, univariate manipulation through location, scale and power transforms adds flexibility to the margins while preserving the overall copula.

In the last chapter of this dissertation, we introduce a composite of the Marshall-Olkin, additive and multiplicative models to obtain a novel multivariate Pareto-Dirichlet law, whose rich composition is capable of modelling the heavy-tailed events descriptive of many extremal scenarios in insurance and finance. We study its survival function and the corresponding moments and mixed moments. We then focus on the bivariate case, detailing the intricacies of its inherent expressions, and conclude with a thorough application to the risk and allocation functionals respectively.
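Two of the classical stochastic representations named above can be sketched directly; the minimum construction with a common shock gives the bivariate Marshall-Olkin exponential, while dividing independent exponentials by a shared Gamma background produces dependent Pareto-type margins. All parameter values are illustrative, not taken from the dissertation:

```python
import random

random.seed(3)

def marshall_olkin_pair(l1=1.0, l2=1.0, l0=0.5):
    """One draw from a bivariate Marshall-Olkin exponential:
    a common shock E0 hits both components, so X_i = min(E_i, E0)."""
    e0 = random.expovariate(l0)
    return min(random.expovariate(l1), e0), min(random.expovariate(l2), e0)

def multiplicative_background_pair(l1=1.0, l2=1.0, shape=2.0):
    """Multiplicative background model (illustrative): a shared Gamma
    background rate divides independent exponentials, yielding dependent
    Pareto-type margins."""
    theta = random.gammavariate(shape, 1.0)
    return random.expovariate(l1) / theta, random.expovariate(l2) / theta

pairs = [marshall_olkin_pair() for _ in range(50_000)]
# The common shock forces P(X1 = X2) > 0 -- the model's singular component:
ties = sum(1 for a, b in pairs if a == b)
print(ties / len(pairs))  # close to l0 / (l0 + l1 + l2) = 0.2
```

The positive probability of exact ties is the signature of the Marshall-Olkin common-shock mechanism; composing such constructions is the route by which the final chapter builds its Pareto-Dirichlet law.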

Description

Keywords

Applied mathematics, Mathematics

Citation