Algorithmic differentiation: The silver bullet to overcome the post-Solvency II challenge

Added in AFIR / ERM / RISK

Speaker(s): Sven Ludwig (FIS Systeme GmbH)

With the completion of Solvency II projects, the next challenge takes center stage: how to manage the company in the new regime? How to steer the Market Consistent Embedded Value, and how to manage the capital?
MCEV, EEV and SCR are key measures which depend on thousands of input parameters or variables. The sensitivities of these measures to their input parameters need to be assessed and understood to manage the insurance company. A common means of understanding these sensitivities is the calculation of selected stress tests, each combining modifications of a set of input parameters. However, a single stress test requires the same amount of computation, and most importantly the same time, as the initial base calculation (e.g. of the MCEV). Hence only a limited number of sensitivities can be determined, and not knowing the key drivers of the essential measures is a serious constraint for management.
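To make the cost argument concrete, here is a minimal sketch (not from the presentation) of the classic "bump and revalue" approach, with a hypothetical toy function f standing in for a full MCEV/SCR valuation. Each sensitivity costs one extra full revaluation, so the total cost grows linearly with the number of input parameters:

```python
def f(x):
    # toy stand-in for an expensive valuation: a nonlinear function of many inputs
    return sum(xi ** 2 for xi in x)

def bump_and_revalue(f, x, h=1e-6):
    """Approximate all partial derivatives by forward finite differences.

    Requires one base valuation plus one full revaluation per parameter,
    and the results are approximations, not exact derivatives.
    """
    base = f(x)                       # one base valuation
    grads = []
    for i in range(len(x)):           # one extra valuation per parameter
        bumped = list(x)
        bumped[i] += h
        grads.append((f(bumped) - base) / h)
    return grads

print(bump_and_revalue(f, [1.0, 2.0, 3.0]))   # ≈ [2.0, 4.0, 6.0]
```

With thousands of parameters, this loop means thousands of full valuation runs, which is exactly the bottleneck described above.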
Adjoint Algorithmic Differentiation (AAD) is a programming technique which removes this constraint. Within roughly five times the time required for the base calculation, AAD provides sensitivities to all (i.e. thousands of) input parameters. These sensitivities are not approximations: they are the mathematically exact values of the partial derivatives of the target function. AAD does not require the actuary to derive the partial derivatives by hand; they are a by-product of the programming technique, which also allows the challenge of discontinuities to be overcome.
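The mechanism can be sketched in a few lines. The following is a minimal, illustrative reverse-mode (adjoint) tape, not the tooling discussed in the presentation: one forward pass records each operation together with its local derivatives, and one backward sweep then accumulates exact partial derivatives with respect to all inputs at once, regardless of how many there are:

```python
class Var:
    """A value on the tape, recording its parents and local derivatives."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents    # pairs of (parent_var, local_derivative)
        self.adjoint = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def backward(output):
    """Propagate adjoint contributions from the output back to every input
    (one application of the multivariate chain rule per recorded operation)."""
    stack = [(output, 1.0)]
    while stack:
        v, adj = stack.pop()
        v.adjoint += adj
        for parent, local in v.parents:
            stack.append((parent, adj * local))

x, y = Var(3.0), Var(4.0)
z = x * y + x            # z = x*y + x, so dz/dx = y + 1 = 5, dz/dy = x = 3
backward(z)              # one backward sweep yields ALL exact sensitivities
print(x.adjoint, y.adjoint)   # 5.0 3.0
```

The single backward sweep is why the cost is a small constant multiple of the base calculation rather than proportional to the number of inputs.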
The removal of these constraints by AAD sounds like black magic. In fact, it is a smart way of decomposing calculations and applying the multivariate chain rule. The presentation explains two algorithmic differentiation approaches: AAD and TAD (Tangent Algorithmic Differentiation). It illustrates how these approaches work and how they differ from each other. Furthermore, challenges encountered in practice are outlined, with indications of how they are solved.
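For contrast with the adjoint mode, the tangent (forward) mode can be sketched with dual numbers. This is an illustrative toy example, not material from the presentation: each sweep carries one directional derivative alongside the value, so obtaining the sensitivity to every input requires one sweep per input direction, the mirror image of the adjoint mode's one-sweep-for-all-inputs behaviour:

```python
class Dual:
    """Value plus tangent: the derivative in one chosen input direction."""
    def __init__(self, value, tangent=0.0):
        self.value, self.tangent = value, tangent

    def __add__(self, other):
        return Dual(self.value + other.value, self.tangent + other.tangent)

    def __mul__(self, other):
        # product rule, applied per elementary operation
        return Dual(self.value * other.value,
                    self.tangent * other.value + self.value * other.tangent)

def f(x, y):
    return x * y + x        # same toy function: df/dx = y + 1, df/dy = x

# seed the x-direction: the output tangent is the exact df/dx
print(f(Dual(3.0, 1.0), Dual(4.0, 0.0)).tangent)   # 5.0
# seed the y-direction: the output tangent is the exact df/dy
print(f(Dual(3.0, 0.0), Dual(4.0, 1.0)).tangent)   # 3.0
```

Both modes deliver exact derivatives by decomposing the calculation into elementary operations and chaining their local derivatives; they differ only in the direction in which the chain rule is accumulated.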
