If you would like to get an email when new preprints are added, please send your name and email address to Ms. Aline Strolz, email strolz@isb.unizh.ch. If you want to see some of the papers sorted by RiskLab research project, visit the Projects Web Page. If you are looking for something specific, please use your browser's find/search command.

Risk Theory in a Markov-Dependent Stochastic Economic Environment 

Authors:  Prof. Dr.
Jeffrey F. Collamore (University of Copenhagen,
former member of RiskLab) Dr. Harri Nyrhinen (University of Helsinki) 
Measuring OpRisk by the Small-Time Ruin Probability: A Volume-Based Model 

Authors:  Prof. Dr.
Jeffrey F. Collamore (University of Copenhagen,
former member of RiskLab) Andrea Höing (Department of Mathematics, ETH Zürich) 
Estimation of the Stylized
Facts of a Stochastic Cascade Model 

Authors:  Dr. Céline Azizieh
(Département
de Mathématique, Université
Libre de Bruxelles, former Visiting postdoctoral research fellow at RiskLab) PD Dr. Wolfgang Breymann (RiskLab) 
Project:  Volatility Estimation and Risk Measurement: From Short to Long-Time Horizons 
A Realistic Heterogeneous Multi-Agent Model for the FX Market 

Authors:  PD Dr.
Wolfgang Breymann (RiskLab) Christoph M. Schmid (Department of Mathematical Statistics and Actuarial Science, University of Bern) 
Project:  Volatility Estimation and Risk Measurement: From Short to Long-Time Horizons 

HfFinance: An S-PLUS Tool for Deseasonalizing High-Frequency Financial Data  
Author:  PD Dr. Wolfgang Breymann (RiskLab)  
Abstract:  The package provides functionality for analyzing the seasonality of high-frequency financial time series and deseasonalizing them, either through a time transformation or through a volatility weighting of the returns. It is written for S-PLUS and relies heavily on the "timeSeries" classes, but it does not require the S+FinMetrics package. For a technical description of the method, please refer to W. Breymann (2000) and W. Breymann et al. (2003).  
Remark:  The present version is in a beta state. The author would be grateful for any bug reports, suggestions and comments.  
License:  The package is published under the GNU General Public License (GPL).  
Project:  Volatility Estimation and Risk Measurement: From Short to Long-Time Horizons  
Download:  http://www.math.ethz.ch/~breymann/tools.html  
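The volatility-weighting idea mentioned in the abstract can be illustrated with a minimal sketch (plain Python/NumPy, not the package's actual S-PLUS interface; the function name and the regular-grid assumption are hypothetical):

```python
import numpy as np

def deseasonalize(returns, period):
    """Deseasonalize regularly sampled intraday returns by dividing each
    return by the sample volatility of its intraday slot (a crude stand-in
    for the volatility-weighting idea)."""
    r = np.asarray(returns, dtype=float)
    n = len(r) // period * period          # drop the incomplete last cycle
    r = r[:n].reshape(-1, period)          # one row per day (cycle)
    profile = r.std(axis=0)                # seasonal volatility profile
    profile = np.where(profile > 0.0, profile, 1.0)
    return (r / profile).ravel()
```

After this weighting every intraday slot has unit sample volatility, so the deterministic seasonal pattern no longer dominates subsequent volatility estimation.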

Representations of the First Hitting Time Density of an Ornstein-Uhlenbeck Process  
Authors:  Dr. Larbi Alili
(Department of Mathematics, ETH Zürich) Pierre Patie (RiskLab) Dr. Jesper L. Pedersen (Statistics Division, University of Copenhagen) 

Abstract:  Different expressions are given for the density of the first hitting time of a fixed level by an Ornstein-Uhlenbeck process. This density is closely related to the Laplace transform functional of a three-dimensional Bessel bridge. The first expression relies on an eigenvalue expansion involving zeros of the parabolic cylinder functions. The second one is an integral representation involving some new special functions.  
Date:  2005  
Keywords and Phrases:  Ornstein-Uhlenbeck process, hitting time density, Laplace transform, Fourier transform, Bessel bridge.  
MSC 2000:  60J60, 60E10  
Size:  11 pages  
Status: 
To appear in Stochastic Models, Issue 21.4, 2005  
Textfiles: 
PDF, PS 
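The densities represented above can be cross-checked by simulation; the following is a minimal Euler-scheme Monte Carlo sketch for the first hitting time of a level by an OU process (illustrative only, with hypothetical parameters, not the paper's analytic representations):

```python
import numpy as np

def ou_hitting_times(x0, level, theta, sigma, dt=1e-3, t_max=10.0,
                     n_paths=2000, seed=1):
    """Simulate first hitting times of `level` by the OU process
    dX = -theta*X dt + sigma dW, X_0 = x0 (Euler scheme).
    Paths that never reach the level within t_max are censored at t_max."""
    rng = np.random.default_rng(seed)
    n_steps = int(t_max / dt)
    x = np.full(n_paths, float(x0))
    hit = np.full(n_paths, t_max)
    alive = np.ones(n_paths, dtype=bool)
    for i in range(1, n_steps + 1):
        dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)
        x[alive] += -theta * x[alive] * dt + sigma * dw[alive]
        crossed = alive & (x >= level)
        hit[crossed] = i * dt
        alive &= ~crossed
        if not alive.any():
            break
    return hit
```

A histogram of the uncensored hitting times approximates the density whose series and integral representations the paper derives.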


Time Scaling for GARCH(1,1) and AR(1)-GARCH(1,1) Processes 

Authors:  Raymond
Brummelhuis (University of London) Roger Kaufmann (Swiss Life, former member of RiskLab) 

Abstract:  This paper investigates the estimation of a 10-day value-at-risk based on a data set of 250 daily values. The commonly used square-root-of-time rule, which scales the 1-day 99% value-at-risk by a factor of √10, is compared with alternative 10-day estimators in the case of random walks, GARCH(1,1) and AR(1)-GARCH(1,1) processes. Additionally, some theoretical results on N-day value-at-risk in such models are presented. The overall conclusion is that, although not perfect, the √10-rule performs remarkably well. 

Keywords:  Value-at-risk, scaling rules, random walk, GARCH(1,1) process, AR(1)-GARCH(1,1) process.  
Date:  November, 2004  
Type:  Submitted Preprint  

Size:  48 pages  
Textfiles:  Postscript
(3.6 MBytes, US letter format) Portable Document Format (695KBytes, US letter format) 
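The square-root-of-time rule discussed in the abstract is easy to illustrate numerically; for i.i.d. normal (random walk) returns it is exact, as this sketch with simulated data shows (hypothetical parameters, not the paper's data set):

```python
import numpy as np

rng = np.random.default_rng(42)
# 100,000 simulated 10-day blocks of i.i.d. normal daily returns
daily = rng.normal(0.0, 0.01, size=(100_000, 10))

# 1-day 99% value-at-risk (positive number = loss quantile), scaled by sqrt(10)
var_1d = -np.quantile(daily[:, 0], 0.01)
var_10d_scaled = np.sqrt(10) * var_1d

# direct empirical 10-day 99% value-at-risk
var_10d = -np.quantile(daily.sum(axis=1), 0.01)

# for i.i.d. normal returns the two estimates agree up to sampling error
rel_err = abs(var_10d_scaled - var_10d) / var_10d
```

For GARCH-type processes the rule is no longer exact; the deviation is precisely what the paper quantifies.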


Default Risk for Residential Mortgage Portfolios  
Authors:  Enrico
De
Giorgi (IEW, former member
of RiskLab), Vlatka Komaric (Credit Suisse Group), Jürg Burkhard (FAME) 

Type:  Published Paper 



Project:  Risk Modelling for a Swiss
Retail/Middle Market Loan Portfolio 


Reference:  Wilmott Magazine, July 2004  

Wavelet Galerkin Pricing of American Options on Lévy Driven Assets  
Authors:  Dr. Ana-Maria Matache
(RiskLab and SAM, ETH Zürich), Pál-Andrej Nitsche (SAM, ETH Zürich), Prof. Dr. Christoph Schwab (SAM, ETH Zürich) 

Abstract:  The price of an American-style contract on assets driven by Lévy processes with infinite jump activity is expressed as the solution of a parabolic variational integro-differential inequality (PIDI). A Galerkin discretization in logarithmic price using a wavelet basis is presented, with compression of the moment matrix of the jump part of the price process's Dynkin operator. An iterative solver with wavelet preconditioning for the resulting large matrix inequality problems is presented and its efficiency is demonstrated by numerical experiments.  
Keywords:  Lévy processes, integro-differential operators, variational inequalities, Galerkin discretization, biorthogonal wavelet basis, wavelet preconditioning  
Date:  July 18, 2003  
Type:  Preprint  

Project:  Fast Deterministic Computation of Valuations for Assets Driven by Lévy Processes  

Size:  26 pages  
Textfiles:  Postscript
(2783 KBytes, US letter format) Compressed Postscript (gzip, 260 KBytes, US letter format) Portable Document Format (1238 KBytes, US letter format) 


On the First Passage Times of Generalized Ornstein-Uhlenbeck Processes  
Author:  Pierre Patie (RiskLab, ETH Zürich)  
Abstract:  We study the two-dimensional joint distribution of the first passage time of a constant level by spectrally negative generalized Ornstein-Uhlenbeck processes and their primitive stopped at this first passage time. We give an explicit expression for the Laplace transform of the distribution in terms of new special functions. Finally, we give an application in finance, which consists in computing the Laplace transform of the price of a European call option on the maximum of the yield in the generalized Vasicek model. The stable case is studied in more detail.  
Date:  April 2003  
Type:  Submitted preprint  
Keywords:  Generalized Ornstein-Uhlenbeck process, stable process, first passage time, martingale, special function, term structure, path-dependent option  
MSC 2000:  Primary 60G44, 60G51; Secondary 33C05, 91B70 

Size:  18 pages  
Textfile:  Postscript
(1002 KBytes) Compressed Postscript (gzip, 574 KBytes) Portable Document Format (257 KBytes) 


An Optimisation Algorithm for the Volatility Forecasting of FX Rates with the Stochastic Cascade Model  
Author:  David Mac Audière (visiting researcher at RiskLab)  
French title:  Un algorithme d'optimisation pour la prédiction de la volatilité des cours des changes avec le Modèle en Cascade Stochastique (MCS)  
Abstract:  The aim of this work was to develop a fast numerical algorithm to approximate a weighted sum of N multivariate Gaussians by a weighted sum of K Gaussians with K << N. This algorithm is to be used in a model for forecasting volatility. The volatility is decomposed into a product of lognormally distributed random factors which model the parts of the volatility corresponding to the different time horizons. The volatility dynamics results from the dynamics of the set of factors. A Bayesian approach is used for the inference of the distribution of the factors from observed data. It is expected that, with a good optimisation algorithm, it will be possible to take advantage of the full power of this approach.  
Given that the dimensionality of the Gaussians is around ten, N may be of the order of one hundred and K of the order of ten. This is an optimisation problem with several hundred parameters, and the evaluation of a single point in the parameter space amounts to computing the norm of a 10-dimensional function.  
We took the following approach: the original function was sampled and this sample was treated as observed data. The log-likelihood function was then evaluated in the limit of an infinitely large sample. The limiting function is an integral that was evaluated numerically and optimised by means of a steepest descent procedure.  
The bulk of this work was devoted to the implementation of the numerical algorithm. The test results looked promising, but due to a lack of time the optimisation algorithm has not been tested within the forecasting model.  
Date:  2002  
Type:  Project report for the year 2001/02, E.N.S.T.A.  
Supervision:  PD Dr. Wolfgang Breymann (RiskLab)  
Project:  Volatility Estimation and Risk Measurement: From Short to Long-Time Horizons  
Language:  French  
Textfiles:  Available upon request.  
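The report's core task, compressing a weighted sum of N Gaussians into K << N components, can be sketched in one dimension by sampling the N-component mixture and fitting K components with EM (an illustrative substitute for the report's limiting log-likelihood integral and steepest descent; all parameters below are hypothetical):

```python
import numpy as np

def em_fit(x, k, n_iter=200):
    """Fit a k-component 1-D Gaussian mixture to data x by plain EM,
    initialising the means at spread-out quantiles of the data."""
    w = np.full(k, 1.0 / k)
    mu = np.quantile(x, np.linspace(0.05, 0.95, k))
    var = np.full(k, x.var())
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each point
        d = x[:, None] - mu[None, :]
        p = w * np.exp(-0.5 * d**2 / var) / np.sqrt(2 * np.pi * var)
        resp = p / p.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and variances
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        d = x[:, None] - mu[None, :]
        var = np.maximum((resp * d**2).sum(axis=0) / nk, 1e-8)
    return w, mu, var

# target: N = 6 components forming two well-separated groups
rng = np.random.default_rng(1)
centers = np.array([-5.5, -5.0, -4.5, 4.5, 5.0, 5.5])
comp = rng.integers(0, 6, size=5000)
sample = rng.normal(centers[comp], 1.0)

w, mu, var = em_fit(sample, k=2)   # compress to K = 2 components
```

The fitted two-component means land near the two group centres, which is the qualitative behaviour the report's optimisation targets in ten dimensions.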

Risk Management in Credit Risk Portfolios with Correlated Assets  
Author:  Prof. Dr. Nicole Bäuerle (Institute for Mathematical Stochastics, Universität Hannover, former visiting postdoctoral research fellow at RiskLab)  
Abstract:  We consider a structural model of credit risk in the spirit of Merton [Journal of Finance 29 (1974) 449], where a firm defaults when its market value falls below the value of its debts or a certain given threshold level. Models of this type have been used extensively for valuing default-risky securities. Our research focuses on the effect that positive dependence between the credit risks in a portfolio has on the risk of the lending institute. In order to allow for unexpected defaults, we suppose that the firm's asset value follows a geometric Lévy process. Two main results are obtained: positive dependence in terms of association between the credit risks always leads to a higher risk for the lending institute than independent credit risks, where the risk of the institute can be measured by any of the following risk measures: variance, upper partial moments or the risk-sensitive measure. In a second part we investigate the influence of the portfolio structure. We suppose that a firm's asset value is influenced by an idiosyncratic risk, a sector-specific risk and a systematic risk. Sectors can be defined, e.g., by type of industry or geographic region. We show that whenever one sector structure majorizes another, the risk for the lending institute increases. In particular, this proves a positive effect of diversification.  
Keywords:  Structural credit risk model, association, positive dependence, Lévy process  
Date:  March 1, 2001 (First version)  
Type:  Published paper  
Projects:  Combined Market and Credit Risk Stress Testing  
Dependence Modelling in Risk Management  
Remark:  The paper was initiated during the RiskLab visit.  
Reference:  Insurance: Mathematics and Economics 30, no. 2 (2002) 187-198.  
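The paper's first main result, that positive dependence between credit risks increases the lender's risk, can be illustrated with a minimal one-factor Gaussian simulation (a deliberately simplified stand-in for the paper's Lévy-process model; all numbers are hypothetical):

```python
import numpy as np

def default_counts(rho, n_firms=100, n_sims=20_000, seed=7):
    """Simulate portfolio default counts in a one-factor Gaussian model:
    asset_i = sqrt(rho)*Z + sqrt(1-rho)*eps_i, default if asset_i < threshold."""
    rng = np.random.default_rng(seed)
    threshold = -1.6449  # approx. 5% marginal default probability under N(0,1)
    z = rng.normal(size=(n_sims, 1))          # common (systematic) factor
    eps = rng.normal(size=(n_sims, n_firms))  # idiosyncratic factors
    assets = np.sqrt(rho) * z + np.sqrt(1.0 - rho) * eps
    return (assets < threshold).sum(axis=1)

indep = default_counts(rho=0.0)
corr = default_counts(rho=0.3)
```

Both portfolios have the same expected number of defaults, but association through the common factor inflates the variance of the default count, matching the paper's qualitative conclusion.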

A Utility Maximization Model of Capital Budgeting with Default Risk and Regulatory Constraints  
Authors:  Aydin Akgün
(RiskLab, Swiss
Banking Institute, ZKB and FAME) Prof. Dr. Rajna Gibson (Swiss Banking Institute, University of Zürich) 

Abstract:  The paper focuses on the capital allocation and budgeting decisions of banks from a shareholder utility maximization perspective and in the presence of non-tradeable risks. The stylised model is similar to that of Froot and Stein (1998), who introduce shareholder value maximization to study capital budgeting formally, as opposed to custom-designed methods such as RAROC and EVA used in practice. The presence of default risk and bankruptcy costs, and the introduction of regulatory constraints in the form of trading limits, provide an additional rationale for risk management in our context; this is novel, as is the analysis of how the investment and capital allocation decisions of banks are influenced by such an introduction. Moreover, in this framework one can separately analyse investment and capital allocation decisions regarding the trading book and the banking book.  
JEL Codes:  G21, G31, G38  
Date:  January 2003  
Type:  Preprint  
Project:  Capital Allocation under Regulatory Constraints  
Size:  16 pages  
Textfiles:  Available soon  

A Clarification Note about Hitting Times Densities for Ornstein-Uhlenbeck Processes  
Authors:  Dr. Anja Göing-Jaeschke
(RiskLab, ETH
Zürich) Prof. Marc Yor (Université Pierre et Marie Curie, Laboratoire de Probabilités) 

Abstract:  In this note, we point out that the formula given in the correction note by Leblanc et al. for the distribution of the first hitting time of b by an Ornstein-Uhlenbeck process starting from a is true only in the case b = 0.  
Keywords:  Hitting time, Ornstein-Uhlenbeck process  
JEL Codes:  E43, G13  
MSC 1991:  60E10, 60G17, 60J70, 65U05  
Type:  Paper  
Reference:  Finance and Stochastics, Vol. 7/3 (2003) pp. 413-415 (to appear).  
Project:  Generalizations of Bessel Processes  
Size:  3 pages  
Textfiles:  Available soon  

Extreme Values of Gaussian Processes and a Heterogeneous Multi-Agent Model 

Author:  Christoph M. Schmid (Department of Mathematical Statistics and Actuarial Science, University of Bern)  
Abstract:  This PhD thesis consists of two parts, both of which can be put into a financial context. In the first part we consider the extreme behavior of a certain class of Gaussian processes. We are interested in the probability that a linear combination of Gaussian processes and a deterministic trend exceeds a given high boundary. The result of this stochastic problem can be interpreted as the ruin probability of a portfolio of assets. In the second part we introduce the Heterogeneous Multi-Agent Model, which describes the price process of a virtual foreign exchange market. The goals are to reproduce the statistical properties of price time series of real foreign exchange markets and to support the hypothesis of a heterogeneous market.  
Date:  January 30, 2003  
Type:  Ph.D. thesis  

Supervision:  Prof. Dr. Jürg Hüsler (Department of Mathematical Statistics and Actuarial Science, University of Bern); for the second part: PD Dr. Wolfgang Breymann (RiskLab)  

Project:  Volatility Estimation and Risk Measurement: From Short to Long-Time Horizons  

Size:  204 pages  
Textfile:  Postscript
(22.4 MBytes) Compressed Postscript (gzip, 3.7 MBytes) Portable Document Format (4.7 MBytes) 


Risk Management Strategies
for Banks 

Authors:  Wolfgang Bauer
(RiskLab, ETH Zürich,
and Swiss Banking Institute, University of Zürich) Marc Ryser (ECOFIN Research and Consulting, Zürich) 

Abstract:  We analyze optimal risk management strategies of a bank financed with deposits and equity in a one-period model. The bank's motivation for risk management comes from deposits, which can lead to bank runs. In the event of such a run, liquidation costs arise. The hedging strategy that maximizes the value of equity is derived. We identify conditions under which well-known results such as complete hedging, maximal speculation or irrelevance of the hedging decision are obtained. The initial debt ratio, the size of the liquidation costs, regulatory restrictions, the volatility of the risky asset and the spread between the riskless interest rate and the deposit rate are shown to be the important parameters that drive the bank's hedging decision. We further extend this basic model to include counterparty risk constraints on the forward contract used for hedging.  
Keywords:  bank, bank risk management, corporate hedging  
JEL Codes:  G1, G21, G28  
Date:  November 12, 2002  
Type:  Paper  

Reference:  Journal of Banking and Finance, 28(2), pp. 331-352, 2004  

Project:  Banks' Optimal Hedging Decisions Under the Threat of Bank Runs  

Size:  26 pages  
Textfile:  Postscript
(783 KBytes) Compressed Postscript (gzip, 319 KBytes) Portable Document Format (257 KBytes) 


Dependence Structures for Multivariate High-Frequency Data in Finance 

Authors:  PD Dr.
Wolfgang Breymann (RiskLab) Alexandra Dias (Department of Mathematics, ETH Zürich) Prof. Dr. Paul Embrechts (RiskLab and Department of Mathematics, ETH Zürich) 

Abstract:  Stylised facts for univariate high-frequency data in finance are well-known. They include scaling behaviour, volatility clustering, heavy tails, and seasonalities. The multivariate problem, however, has scarcely been addressed up to now. In this paper, bivariate series of high-frequency FX spot data for major FX markets are investigated. First, as an indispensable prerequisite for further analysis, the problem of simultaneous deseasonalisation of high-frequency data is addressed. In the bulk of the paper we analyse in detail the dependence structure as a function of the time scale. Particular emphasis is put on the tail behaviour, which is investigated by means of copulas and spectral measures. 

Date:  October, 2002; Revision January 2003  
Type:  Paper  

Reference:  Quantitative Finance, 3(1), 1-14, 2003 


Project:  Volatility Estimation and Risk Measurement: From Short to Long-Time Horizons  

Size:  28 pages  
Textfile:  Postscript
(5.1 MBytes) Compressed Postscript (gzip, 1.8 MBytes) Portable Document Format (2.8 MBytes) 


Reward-Risk Portfolio Selection and Stochastic Dominance 

Author:  Enrico De Giorgi (RiskLab and IEW)  
Abstract:  The portfolio selection problem is traditionally modelled by two different approaches. The first is based on an axiomatic model of risk-averse preferences, where decision makers are assumed to possess an expected utility function and the portfolio choice consists in maximizing the expected utility over the set of feasible portfolios. The second approach, first proposed by Markowitz (1952), is very intuitive and reduces the portfolio choice to a set of two criteria, reward and risk, with possible trade-off analysis. Usually the reward-risk model is not consistent with the first approach, even when the decision is independent of the specific form of the risk-averse expected utility function, i.e. when one investment dominates another by second-order stochastic dominance. In this paper we generalize the reward-risk model for portfolio selection. We define reward measures and risk measures by giving a set of properties these measures should satisfy. One of these properties is consistency with second-order stochastic dominance, which establishes a link with expected utility portfolio selection. We characterize reward and risk measures and discuss the implications for portfolio selection.  
Keywords:  stochastic dominance, coherent risk measure, decision under risk, mean-risk models, portfolio optimization  
JEL Code:  G11  
Date:  August 13, 2002  
Type:  RiskLab Paper, accepted for publication by the Journal
of Banking and Finance; IEW Working Paper, No. 121. 


Prize:  The paper has won the Young Economist Award 2003 of the Central Bank of the Republic of Turkey (CBRT). The prize was awarded at the erc/METU International Conference in Economics in Ankara, Turkey.  

Project:  Liquidity Shocks in an Evolutionary Portfolio Theory  

Size:  25 pages  
Textfiles:  Postscript
(344 KBytes) Compressed Postscript (gzip, 129 KBytes) Portable Document Format (310 KBytes) 


Modelling Financial Time
Series with a Multifractal Model 

Author:  Dr. Céline Azizieh (Visiting postdoctoral research fellow at RiskLab)  
Abstract:  In the classical Black-Scholes model, if we define the (logarithmic) return of some financial asset at time t for the (fixed) time interval h by  
r(t,h) := ln S(t) − ln S(t−h)  
where S(t) denotes the price of the asset at time t, then the returns r(.,h) are i.i.d. Gaussian random variables. In practice, however, the return processes of a number of financial assets show different statistical properties. First, the returns distribution presents heavy tails, the tail index being an increasing function of the considered time interval h. A second observation is the long memory of the realized volatility process, which is directly linked with the so-called phenomenon of volatility clustering. Another important observed fact is the presence of scaling laws: define S_{q}(h) as the average of r(jh,h)^{q} with j running from 1 to N/h, where N denotes the length of the considered time series. Then S_{q}(h) appears proportional to h^{f(q)} for some concave function f(q), which suggests multifractality. For all these reasons, new models have been proposed that take these observations into account.  
In this work, we are interested in the modelling of financial time series by fractal and even multifractal models. In a first, mathematical part, we recall some notions relative to self-similar and multifractal processes. In a second part, we recall the preceding stylized facts (with illustrations on some high-frequency data from the foreign exchange market), and we introduce the stochastic cascade model proposed by Breymann-Ghashghaie-Talkner (Journal of Theoretical and Applied Finance, 2000). This is a multifractal model based on an analogy with hydrodynamic turbulence which tries to reproduce the preceding statistical properties observed in the data. We compare the model to the data and we attempt to estimate some parameters.  
Date:  August, 2002  
Type:  Diploma thesis for insurance mathematics, presented at the Université Libre de Bruxelles  

Supervision:  PD Dr.
Wolfgang Breymann (RiskLab) Prof. Dr. Pierre Devolder (Université Libre de Bruxelles) 


Project:  Volatility Estimation and Risk Measurement: From Short to Long-Time Horizons  

Size:  142 pages  
Language:  French (Modélisation de séries financières par un modèle multifractal)  
Textfile:  Portable
Document Format (8.1 MBytes) Compressed Portable Document Format (gzip, 5.1 MBytes) 
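The scaling-law analysis described in the abstract, estimating f(q) from S_q(h) ∝ h^{f(q)}, can be sketched as follows; for an ordinary Brownian random walk the estimate should recover the monofractal value f(q) = q/2 (illustrative synthetic data, not the thesis's FX series):

```python
import numpy as np

def scaling_exponent(logprice, q, lags):
    """Estimate f(q) from S_q(h) ~ h^f(q), where S_q(h) is the average of
    |r(jh,h)|^q over non-overlapping returns r(jh,h) = ln S(jh) - ln S((j-1)h)."""
    s = []
    for h in lags:
        r = np.diff(logprice[::h])         # non-overlapping lag-h returns
        s.append(np.mean(np.abs(r) ** q))
    slope, _ = np.polyfit(np.log(lags), np.log(s), 1)
    return slope

rng = np.random.default_rng(5)
logprice = np.cumsum(rng.normal(0.0, 0.01, size=200_000))  # Brownian walk
lags = [1, 2, 4, 8, 16, 32]
f2 = scaling_exponent(logprice, q=2, lags=lags)
```

Multifractal data, such as output of the cascade model, would instead yield a strictly concave f(q); that concavity is the stylized fact the thesis estimates.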


Fast Deterministic Pricing of Options on Lévy Driven Assets  
Authors:  Dr. Ana-Maria Matache
(RiskLab and SAM, ETH Zürich), Dr. Tobias von Petersdorff (Department of Mathematics, University of Maryland), Prof. Dr. Christoph Schwab (SAM, ETH Zürich) 

Abstract:  A partial integro-differential equation (PIDE) d_{t}u + A[u] = 0 for European contracts on assets with general jump-diffusion price process of Lévy type is derived. The PIDE is localized to bounded domains and the error due to this localization is estimated. The localized PIDE is discretized by the θ-scheme in time and a wavelet Galerkin method with N degrees of freedom in space. The full Galerkin matrix for A can be replaced with a sparse matrix in the wavelet basis, and the linear systems for each time step are solved approximately with GMRES in linear complexity. The total work of the algorithm for M time steps is bounded by O(MN(ln N)^{2}) operations and O(N ln N) memory. The deterministic algorithm gives optimal convergence rates (up to logarithmic terms) for the computed solution in the same complexity as finite difference approximations of the standard Black-Scholes equation. Computational examples for various Lévy price processes (Variance Gamma, CGMY) are presented.  
Date:  July 18, 2003  
Type:  Preprint  

Project:  Fast Deterministic Computation of Valuations for Assets Driven by Lévy Processes  

Size:  35 pages  
Textfiles:  Postscript
(2818 KBytes, US letter format) Compressed Postscript (gzip, 366 KBytes, US letter format) Portable Document Format (945 KBytes, US letter format) 
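The θ-scheme time stepping mentioned in the abstract can be illustrated on the diffusion part alone (a plain finite-difference sketch of the time discretization, not the paper's wavelet Galerkin method for the Lévy PIDE; grid sizes are hypothetical):

```python
import numpy as np

def theta_step(u, dt, dx, sigma, theta=0.5):
    """One theta-scheme step for u_t = (sigma^2/2) u_xx on an interior grid
    with zero Dirichlet boundary values (theta = 0.5 is Crank-Nicolson,
    theta = 1 is implicit Euler)."""
    n = len(u)
    a = 0.5 * sigma**2 * dt / dx**2
    # discrete Laplacian (tridiagonal); dense here for brevity
    L = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
    lhs = np.eye(n) - theta * a * L
    rhs = (np.eye(n) + (1.0 - theta) * a * L) @ u
    return np.linalg.solve(lhs, rhs)
```

The paper's contribution is to make the spatial operator A, including the nonlocal jump part, sparse in a wavelet basis, so that each such implicit step costs essentially O(N ln N) instead of the dense solve used above.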


Coherent Multiperiod Risk
Measurement 

Authors:  Prof. Dr. Philippe Artzner (Université Louis
Pasteur Strasbourg and RiskLab) Prof. Dr. Freddy Delbaen (Department of Mathematics, ETH Zürich) Dr. Jean-Marc Eber Prof. Dr. David Heath (Department of Mathematical Sciences, Carnegie Mellon University) Prof. Dr. Hyejin Ku (Department of Mathematics, University of North Carolina at Charlotte) 

Abstract:  We explain why and how to deal with the definition, acceptability, computation and management of risk in a genuinely multitemporal way. Coherence axioms provide a representation of a risk-adjusted valuation. Some special cases of practical interest, allowing for easy recursive computations, are presented. The multiperiod extension of Tail VaR is discussed.  
Date:  February 13, 2002  
Type:  Paper  

Project:  Measures of Multiperiod Risk and Time Allocation of Capital  

Size:  13 pages  
Textfile:  Postscript
(647 KBytes) Compressed Postscript (gzip, 171 KBytes) Portable Document Format (169 KBytes) 


On Nonlinear Integral Equations Arising in Problems of Optimal Stopping  
Authors:  Dr. Jesper Lund Pedersen
(RiskLab, ETH Zürich) Prof. Dr. Goran Peskir (University of Aarhus) 

Abstract:  Available in pdf format  
Keywords:  Optimal stopping, finite horizon, Brownian motion, free-boundary problem, nonlinear integral equation, Itô-Tanaka formula, local time, curved boundary, first-passage problem  
MSC 2000:  Primary 60G40, 35R35, 45G10; Secondary 60J65, 60J60, 45G15  
Date:  2001  
Type:  Published Paper  

Reference:  Functional analysis, VII (Dubrovnik 2001), Various Publ. Ser. Vol. 46 (2002) 159-175.  

The Minimum Maximum of a Continuous Martingale with Given Initial and Terminal Laws  
Authors:  Dr. David Hobson
(Department of Mathematical
Sciences, University of Bath,
UK), Dr. Jesper Lund Pedersen (RiskLab, ETH Zürich) 

Abstract:  Let (M_{t})_{0<=t<=1} be a continuous martingale with initial law M_{0} ~ µ_{0} and terminal law M_{1} ~ µ_{1}, and let S = sup_{0<=t<=1} M_{t}. In this paper we prove that there exists a greatest lower bound, with respect to the stochastic ordering of probability measures, on the law of S. We give an explicit construction of this bound. Furthermore, a martingale is constructed which attains this minimum by solving a Skorokhod embedding problem. The form of this martingale is motivated by a simple picture. The result is applied to the robust hedging of a forward start digital option.  
Keywords:  Continuous martingale, maximum process, stochastic domination, greatest lower bound, Brownian motion, Skorokhod embedding, excursion, digital option, robust hedging  
MSC 2000:  Primary 60G44, 60E15; Secondary 60J65.  
Date:  October 2001  
Type:  Published Paper  

Reference:  Annals of Probability, Vol. 30, No. 2 (2002) 978-999.  
Size:  19 pages  
Textfiles:  Postscript
(947 KBytes) Compressed Postscript (gzip, 285 KBytes) Portable Document Format (254 KBytes) 


Importance Sampling Techniques for the Multidimensional Ruin Problem for General Markov Additive Sequences of Random Vectors  
Author:  Dr. Jeffrey F. Collamore (RiskLab)  
Abstract:  Let {(X_{n},S_{n}): n=0,1,...} be a Markov additive process, where {X_{n}} is a Markov chain on a general state space and S_{n} is an additive component on R^{d}. We consider P{S_{n} in A/c for some n} as c tends to 0, where A is an open subset of R^{d} and the mean drift of {S_{n}} is away from A. Our main objective is to study the simulation of P{S_{n} in A/c for some n} using the Monte Carlo technique of importance sampling. If the set A is convex, then we establish: (i) the precise dependence (as c tends to 0) of the estimator variance on the choice of the simulation distribution; (ii) the existence of a unique simulation distribution which is efficient and optimal in the asymptotic sense of Siegmund (1976). We then extend our techniques to the case where A is not convex. Our results lead to positive conclusions which complement the multidimensional counterexamples of Glasserman and Wang (1997).  
Date:  Summer 2001  
Type:  Paper  
Reference:  The Annals of Applied Probability 12 (2002) 382421  
Size:  35 pages  
Textfiles:  Available via Dr. J. Collamore's home page  
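The paper treats multidimensional Markov-additive sequences; the underlying Siegmund-type exponential change of measure is easiest to see in a one-dimensional i.i.d. sketch (hypothetical Gaussian increments, not the paper's general setting):

```python
import numpy as np

def ruin_prob_is(b, mu=-1.0, sigma=1.0, n_sims=5000, seed=3):
    """Importance-sampling estimate of P(max_n S_n >= b) for a random walk
    with N(mu, sigma^2) increments, mu < 0. The walk is simulated under the
    exponentially tilted drift -mu (Siegmund's change of measure), under
    which the level b is hit almost surely, and each path is reweighted by
    the likelihood ratio exp(-gamma * S_tau) at the hitting time."""
    rng = np.random.default_rng(seed)
    gamma = -2.0 * mu / sigma**2        # Lundberg exponent: E[exp(gamma*X)] = 1
    weights = np.empty(n_sims)
    for k in range(n_sims):
        s = 0.0
        while s < b:
            s += rng.normal(-mu, sigma)  # tilted increment, positive drift
        weights[k] = np.exp(-gamma * s)
    return weights.mean()

p_hat = ruin_prob_is(b=3.0)
```

Every weight is at most exp(-gamma*b), so the estimator has bounded relative error here; choosing such a tilt optimally, and in R^d, is what the paper analyses.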

Dealing with Dangerous Digitals  
Authors:  Dr. Uwe Schmock
(RiskLab, ETH
Zürich) Prof. Steven E. Shreve (Carnegie Mellon University, USA) Dr. Uwe Wystup (Commerzbank Treasury and Financial Products, Germany) 

Abstract:  Options with discontinuous payoffs are generally traded above their theoretical Black-Scholes prices because of the hedging difficulties created by their large delta and gamma values. A theoretical method for pricing these options is to constrain the hedging portfolio and incorporate this constraint into the pricing by computing the smallest initial capital which permits super-replication of the option. We develop this idea for exotic options, in which case the pricing problem becomes one of stochastic control. The high cost of exact super-replication coincides with market price quotations for dangerous derivatives such as reverse knock-out barrier options, which are often higher than their risk-neutral expected payoff (theoretical value). This paper illustrates how the theory of leverage-constrained pricing can be successfully applied to compute close-to-market option values and serves as a practitioner's guide to derive explicit formulae and compute prices by finite difference methods.  
Date:  October 25, 2001  
Type:  Paper  
Reference:  Foreign Exchange Risk, Risk Publications, London 2001 (to appear)  
Size:  25 pages including 6 figures  
Textfiles:  Portable
document format (416 kB) 


On the Coherence of Expected Shortfall  
Authors:  Carlo Acerbi (AbaXBank, Italy) and Dr. Dirk Tasche (RiskLab)  
Abstract:  Expected Shortfall (ES) in several variants has been proposed as a remedy for the deficiencies of Value-at-Risk (VaR), which in general is not a coherent risk measure. Most definitions of ES lead to the same results when applied to continuous loss distributions, but differences may appear when the underlying loss distributions have discontinuities. In this case even the coherence property of ES can be lost unless care is taken with the details of its definition. We compare some of the definitions of expected shortfall, pointing out that there is one which is robust in the sense of yielding a coherent risk measure regardless of the underlying distributions. Moreover, this expected shortfall can be estimated effectively even in cases where the usual estimators for VaR fail.  
Keywords:  expected shortfall; risk measure; worst conditional expectation; tail conditional expectation; value-at-risk (VaR); conditional value-at-risk (CVaR); tail mean; coherence; quantile; subadditivity  
Date:  September 12, 2001  
Type:  Working paper  
Size:  18 pages  
Textfiles:  Postscript
(436 KBytes) Portable Document Format (268 KBytes) 
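One robust variant of the kind compared in the paper, the tail mean with a fractional weight on the marginal order statistic, can be sketched as follows (the sign convention and function name are my own; outcomes are profits, so losses are negative):

```python
import numpy as np

def expected_shortfall(x, alpha):
    """Tail-mean expected shortfall at level alpha for outcomes x
    (profits positive, losses negative): the average of the worst
    alpha-fraction of outcomes, sign-flipped so that ES is positive for
    losses. The marginal order statistic enters with a fractional weight,
    which keeps the estimator well-behaved even when the loss
    distribution has atoms."""
    x = np.sort(np.asarray(x, dtype=float))   # ascending: worst outcomes first
    n = len(x)
    k = int(np.floor(n * alpha))
    frac = n * alpha - k                      # fractional part of n*alpha
    tail_sum = x[:k].sum() + (frac * x[k] if k < n and frac > 0 else 0.0)
    return -tail_sum / (n * alpha)
```

On a 100-point sample with a single loss of 100 and alpha = 5%, the worst five outcomes are averaged, giving an ES of 20, while the empirical 5% quantile (hence VaR) is 0 under the usual definitions; this is the kind of discontinuous case where the choice of definition matters.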


An Academic Response to Basel II  
Authors:  Dr. Jón Daníelsson (LSE/FMG), Prof. Paul Embrechts (RiskLab and Dept. of Mathematics, ETH Zürich), Prof. Charles Goodhart (LSE/FMG), Con Keating (Finance Development Centre), Felix Muennich (LSE/FMG), Dr. Olivier Renault (LSE/FMG), Prof. Hyun Song Shin (LSE/FMG/CEP) 

Abstract:  It is our view that the Basel Committee for Banking Supervision, in its Basel II proposals, has failed to address many of the key deficiencies of the global financial regulatory system and even created the potential for new sources of instability.  
Date:  May 2001  
Type:  Paper  
Reference:  ISSN 1359-9151-130  
Size:  17 pages  
Textfiles:  Postscript
(320 KBytes) Portable Document Format (168 KBytes) 


An Intensity-Based Non-Parametric Default Model for Residential Mortgage Portfolios  
Author:  Enrico De Giorgi (RiskLab)  
Abstract:  In April 2001 Swiss banks held over CHF 500 billion in mortgages. This important segment accounts for about 63% of the loan portfolios of Swiss banks. In this paper we restrict our attention to residential mortgages held by private clients, i.e. borrowers who finance their property by the loan, and we model the probability distribution of the number of defaults using a non-parametric intensity-based approach. We consider the time-to-default and, by conditioning on a set of predictors for the default event, we obtain a log-additive model for the conditional intensity process of the time-to-default, where the contribution of each predictor is described by a smooth function. We estimate the model using a local scoring algorithm from the generalized additive model framework.  
Date:  November 22, 2001  
Type:  Submitted preprint  
Remark:  Paper accepted for a contributed talk by the Scientific Committee of the Bachelier Finance Society 2002 Congress  
Project:  Risk Modelling for a Swiss Retail/Middle Market Loan Portfolio  
Size:  38 pages  
Textfiles:  Postscript (1783 KBytes), Compressed Postscript (gzip, 392 KBytes), Portable Document Format (509 KBytes)

Slides:  27 slides used for the talk at the Risk Day 2001: Postscript (1362 KBytes), Compressed Postscript (gzip, 219 KBytes), Portable Document Format (604 KBytes)
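Illustration:  The log-additive intensity specification in the abstract can be sketched numerically. The smooth predictor effects and all parameter values below are invented for illustration (the paper estimates the smooth functions non-parametrically from data); only the structure, a baseline intensity scaled by the exponential of additive predictor effects, follows the abstract.

```python
import math

# Hypothetical smooth predictor effects f_j (the paper estimates these
# non-parametrically; here we just assume simple shapes for illustration).
def f_ltv(ltv):
    """Effect of the loan-to-value ratio (assumed shape)."""
    return 1.5 * (ltv - 0.6)

def f_rate(rate):
    """Effect of the interest-rate level (assumed shape)."""
    return 0.8 * (rate - 0.04) / 0.04

def intensity(ltv, rate, base=0.005):
    """Log-additive conditional default intensity (per year):
    base * exp(f_ltv + f_rate)."""
    return base * math.exp(f_ltv(ltv) + f_rate(rate))

def default_prob(ltv, rate, horizon=5.0):
    """P(default by `horizon`) for constant covariates: 1 - exp(-lambda*t)."""
    return 1.0 - math.exp(-intensity(ltv, rate) * horizon)

p = default_prob(ltv=0.8, rate=0.05)
```

With a constant intensity lambda, the probability of default by time t is 1 - exp(-lambda*t); the paper's data-driven version replaces the assumed functions above by estimated smooths.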


Risk Management for Derivatives in Illiquid Markets: A Simulation Study  
Authors:  Prof. Dr. Rüdiger Frey (Swiss Banking Institute, University of Zürich) and Pierre Patie (RiskLab)
Abstract:  In this paper we study the hedging of derivatives in illiquid markets. More specifically, we consider a model where the implementation of a hedging strategy affects the price of the underlying security. Following earlier work, we characterize perfect hedging strategies by a nonlinear version of the Black-Scholes PDE. The core of the paper is a simulation study: we present numerical results on the impact of market illiquidity on the hedge cost and the Greeks of derivatives. We go on to offer a new explanation of the implied-volatility smile related to the lack of market liquidity. Finally, we present simulations on the performance of different hedging strategies in illiquid markets.
MSC 2000:  91B28 Finance, portfolios, investment  
JEL Codes:  G12, G13  
Date:  November 26, 2001  
Reference:  Contribution to the forthcoming book: Klaus Sandmann and Philipp Schönbucher (editors), Advances in Finance and Stochastics, Springer-Verlag, Berlin, Heidelberg, New York (2002)

Project:  Risk Management for Derivatives with Market Illiquidities  
Size:  19 pages  
Textfiles:  Postscript (647 KBytes), Compressed Postscript (gzip, 244 KBytes), Portable Document Format (227 KBytes)

Slides:  27 slides used for the talk at the Frankfurt MathFinance Colloquium on Nov. 22, 2001: Postscript (392 KBytes), Compressed Postscript (gzip, 107 KBytes), Portable Document Format (177 KBytes)


Combined Market and Credit Risk Stress Testing based on the Merton Model  
Authors:  Dr. Maria Kafetzaki Boulamatsis and Dr. Dirk Tasche (RiskLab)  
Abstract:  On the basis of the Merton model for the value of a firm's debt, we discuss the problem of stress testing the effects of credit, market and foreign exchange risk on the portfolio of a financial institution. There is no straightforward solution to this problem, since the Merton model links only equity and firm value for a single firm, but includes neither interest rate volatility nor foreign exchange rates.
Date:  June 15, 2001  
Type:  RiskLab report  
Project:  Combined Market and Credit Risk Stress Testing  
Size:  27 pages  
Textfiles:  Postscript (571 KBytes), Compressed Postscript (gzip, 237 KBytes), Portable Document Format (345 KBytes)

Slides:  28 slides used for the talk on Feb. 28, 2001: Postscript (324 KBytes), Compressed Postscript (gzip, 95 KBytes), Portable Document Format (112 KBytes)
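Illustration:  In the Merton model referred to in the abstract, equity is a European call on the firm's asset value with the face value of debt as strike, which is what links equity data to credit risk. A minimal sketch with purely illustrative numbers (V, D, r, sigma, T are not taken from the report):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def merton_equity(V, D, r, sigma, T):
    """Equity value as a European call on firm value V with strike D
    (face value of debt) maturing at T, per the Merton (1974) model."""
    d1 = (math.log(V / D) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return V * norm_cdf(d1) - D * math.exp(-r * T) * norm_cdf(d2)

# Stressing firm value (e.g. an equity-market shock) feeds through to the
# debt value, since debt = V - equity on the model's balance sheet.
E = merton_equity(V=100.0, D=80.0, r=0.03, sigma=0.25, T=1.0)
```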


Multivariate Extremes, Aggregation and Dependence in Elliptical Distributions  
Authors:  Henrik Hult (Department of Mathematics, KTH, Stockholm) and Filip Lindskog (RiskLab)  
Abstract:  In this paper we clarify dependence properties of elliptical distributions by deriving general but explicit formulas for the coefficients of upper and lower tail dependence and spectral measures with respect to different norms. We show that an elliptically distributed random vector is regularly varying if and only if the bivariate marginal distributions have tail dependence. Furthermore, the tail dependence coefficients are fully determined by the tail index of the random vector (or equivalently of its components) and the linear correlation coefficient. Whereas Kendall's tau is invariant in the class of elliptical distributions with continuous marginals and a fixed dispersion matrix, we show that this is not true for Spearman's rho. We also show that sums of elliptically distributed random vectors with the same dispersion matrix (up to a positive constant factor) remain elliptical if they are dependent only through their radial parts.  
Date:  September 12, 2001  
Type:  Submitted preprint  
Project:  Dependence Modelling in Risk Management  
Size:  21 pages  
Textfiles:  Postscript (534 KBytes), Compressed Postscript (gzip, 229 KBytes), Portable Document Format (273 KBytes)

Slides:  21 slides used at the Risk Day 2001: Postscript (524 KBytes), Compressed Postscript (gzip, 194 KBytes), Portable Document Format (378 KBytes)
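Illustration:  For the bivariate t distribution, the standard example of an elliptical distribution with tail dependence, the coefficient of tail dependence has the well-known closed form lambda = 2 * t_{nu+1}(-sqrt((nu+1)(1-rho)/(1+rho))), determined, as the abstract states, by the tail index nu and the correlation rho alone. A stdlib-only sketch (the numerical CDF integration is a convenience for this example, not part of the paper):

```python
import math

def _t_pdf(x, nu):
    """Density of Student's t with nu degrees of freedom."""
    c = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    return c * (1 + x * x / nu) ** (-(nu + 1) / 2)

def _t_cdf(x, nu, n=20000):
    """Student-t CDF for x <= 0 via trapezoidal integration over [x, 0]."""
    h = -x / n
    s = 0.5 * (_t_pdf(x, nu) + _t_pdf(0.0, nu))
    s += sum(_t_pdf(x + i * h, nu) for i in range(1, n))
    return 0.5 - s * h

def t_tail_dependence(rho, nu):
    """Coefficient of (upper = lower) tail dependence of a bivariate t
    distribution with nu degrees of freedom and correlation rho."""
    arg = -math.sqrt((nu + 1) * (1 - rho) / (1 + rho))
    return 2.0 * _t_cdf(arg, nu + 1)

lam = t_tail_dependence(rho=0.5, nu=4)   # heavier tails => lam well above 0
```

As nu grows the t distribution approaches the normal and the coefficient tends to zero, consistent with the paper's result that regular variation (finite tail index) is needed for tail dependence.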


Bewertung von Kreditrisiken – empirische Untersuchungen am Schweizer Kapitalmarkt (Valuation of Credit Risks: Empirical Studies on the Swiss Capital Market)
Author:  Jacqueline Henn (RiskLab and s/bf, HSG)  
Date:  2001  
Reference:  Ph.D. thesis no. 2493, Univ. St. Gallen (available via NEBIS Library Catalogue)  
Project:  Investigation of the Market Price of Credit Risk for the Swiss Bond Market  
Size:  139 pages  

Defaultable Security Valuation and Model Risk  
Author:  Aydin Akgün (RiskLab and Swiss Banking Institute)  
Abstract:  The aim of the paper is to analyse the effects of different model specifications, within a general nested framework, on the valuation of defaultable bonds and some credit derivatives. Assuming that the primitive variables, such as the risk-free short rate and the credit spread, are affine functions of a set of state variables following jump-diffusion processes, efficient numerical solutions for the prices of several defaultable securities are provided. The framework is flexible enough to permit some degree of freedom in specifying the interrelation among the primitive variables. It also allows a richer economic interpretation of the default process. The model is calibrated, and a sensitivity analysis is conducted with respect to the parameters defining the jump terms and the correlation. The effectiveness of dynamic hedging strategies is analysed as well.
JEL Codes:  G13, G19  
Date:  March 2001  
Type:  Preprint  
Project:  Model Risk for Credit and Market Risk Sensitive Securities  
Size:  59 pages  
Textfiles:  Postscript (7434 KBytes), Compressed Postscript (gzip, 833 KBytes), Portable Document Format (949 KBytes)


Kendall's Tau for Elliptical Distributions  
Authors:  Filip Lindskog (RiskLab), Dr. Alexander McNeil (Department of Mathematics, ETHZ), Dr. Uwe Schmock (RiskLab)

Abstract:  By using well-known properties of elliptical distributions, we show that the relation between Kendall's tau and the linear correlation coefficient for bivariate normal distributions holds more generally (subject to only slight modifications) for the class of elliptical distributions. We mention applications of this result to calibrating elliptical distributions and their copulas in the context of multivariate financial time series models and, in particular, portfolio credit risk models.
Keywords:  Robust estimation; linear correlation; Kendall's tau; elliptical distributions  
MSC 2000:  60E05 (primary); 62H20 and 62J10 (secondary)  
Date:  February 1, 2001 (updated June 26, 2001; Remark 2 added August 7, 2002; discussion of credit risks added Nov. 28, 2002)  
Type:  Published paper  
Reference:  Credit Risk Measurement, Evaluation and Management, editors: Georg Bol, Gholamreza Nakhaeizadeh, Svetlozar T. Rachev, Thomas Ridder, Karl-Heinz Vollmer, Physica-Verlag (a Springer-Verlag company), Heidelberg 2003, pages 149-156

Project:  Dependence Modelling in Risk Management  
Size:  7 pages  
Textfiles:  Postscript (835 KBytes), Compressed Postscript (gzip, 291 KBytes), Portable Document Format (528 KBytes)
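Illustration:  The relation in question is tau = (2/pi) arcsin(rho), so rho can be estimated robustly as sin(pi * tau_hat / 2). A minimal simulation check for the bivariate normal case (the sample size, seed and naive O(n^2) tau estimator are illustrative choices, not the paper's):

```python
import math
import random

random.seed(7)

def bivariate_normal_sample(rho, n):
    """Draws from a bivariate standard normal with correlation rho:
    X = Z1, Y = rho*Z1 + sqrt(1 - rho^2)*Z2."""
    out = []
    for _ in range(n):
        z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
        out.append((z1, rho * z1 + math.sqrt(1 - rho * rho) * z2))
    return out

def kendall_tau(pairs):
    """Naive O(n^2) sample Kendall's tau (no ties for continuous data)."""
    n = len(pairs)
    s = 0
    for i in range(n):
        xi, yi = pairs[i]
        for j in range(i + 1, n):
            xj, yj = pairs[j]
            s += 1 if (xi - xj) * (yi - yj) > 0 else -1
    return 2.0 * s / (n * (n - 1))

rho_true = 0.6
tau_hat = kendall_tau(bivariate_normal_sample(rho_true, 500))
rho_hat = math.sin(math.pi * tau_hat / 2.0)   # invert tau = (2/pi)*arcsin(rho)
```

Because tau depends only on ranks, rho_hat is insensitive to outliers and heavy tails, which is the robustness point made in the companion report on linear correlation estimation below.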


Rules of Capital Allocation and Coherent Measures of Risk (CAPA)  
Author:  Prof. Dr. Philippe Artzner (Université Louis Pasteur and RiskLab)  
Abstract:  The examination of cases mentioned by practitioners has led us to consider multi-period risk as an important subject in the study of risk capital allocation. A risk measurement should take into account the historical development of the cash flows or of the market value of the final wealth. Traditional risk measures of the final worth of a position, including Tail-VaR, fail to do this. Using the idea behind the scenarios method, a backward recurrence scheme is used to define risk-adjusted values of a future terminal position at dates closer and closer to the one at which risk capital has to be defined; the latter will be the negative of the adjusted value at this initiation date. A specific method, related to the idea of the price of risk, is presented and applied to several examples: time evolution of risk, comparison of various measures, discrepancy between regulatory and managerial approaches, and funding liquidity.  
A (still confidential) working paper, Acceptability of Multiperiod Risk, will be made available later.
Date:  February 16, 2000  
Type:  Report  
Project:  Rules of Capital Allocation (CAPA) and Coherent Measures of Risk  
Size:  3 pages  
Textfiles:  Postscript (200 KBytes), Compressed Postscript (gzip, 75 KBytes), Portable Document Format (63 KBytes)


Worst Case Model Risk Management  
Authors:  Dr. Denis Talay (INRIA Sophia-Antipolis, France), Zheng Ziyu (INRIA Sophia-Antipolis, France, and RiskLab)

Abstract:  We are interested in model risk control problems. We study a strategy for the trader which, in a sense, guarantees good performance whatever the unknown model for the assets of his/her portfolio may be. The trader chooses trading strategies to decrease the risk and therefore acts as a minimizer; the market systematically acts against the interest of the trader, so we consider that it acts as a maximizer. Thus we treat the model risk control problem as a two-player (Trader versus Market), zero-sum stochastic differential game. Our construction therefore corresponds to a 'worst case' worry and, in this sense, can be viewed as a continuous-time extension of discrete-time strategies based upon prescriptions issued from VaR analyses at the beginning of each period. In addition, the initial value of the optimal portfolio can be seen as the minimal amount of money needed to face the worst possible damage. We give a proper mathematical statement of this game problem. We prove that the value function of the game is the unique viscosity solution of a Hamilton-Jacobi-Bellman-Isaacs (HJBI) equation and satisfies the Dynamic Programming Principle.
Keywords:  Model risk, stochastic differential game, Hamilton-Jacobi-Bellman-Isaacs equation
JEL Code:  G11  
MSC 2000:  91B28 Finance, portfolios, investment; 60H30 Applications of stochastic analysis (to PDE, etc.); 91A15 Stochastic games; 91A23 Differential games

Date:  November 2000; final version February 2002  
Type:  Published paper  
Reference:  Finance and Stochastics, Volume 6, Issue 4 (2002), pp. 517-537
Project:  Model Risk Management for Interest Rate Derivatives  
Textfiles:  Springer LINK: Finance and Stochastics  

Modelling Dependencies in Credit Risk Management  
Author:  Mark A. Nyfeler  
Abstract:  We commence with an overview of the three most widely used credit risk models, developed by KMV, J.P. Morgan (CreditMetrics) and Credit Suisse First Boston (CreditRisk^{+}). The mathematical essentials of each model lie in the way the joint distribution of the so-called 'default indicators', a vector of Bernoulli random variables, is modelled. With the focus on these vectors, we investigate two general frameworks for modelling such binary random events. We also show how the KMV and CreditMetrics methodologies can be translated into the framework of CreditRisk^{+}.
The credit risk models are then compared for 'homogeneous' portfolios using Monte Carlo simulation. As two of the three models use the multivariate normal distribution for their 'latent variables', we investigate the impact of proceeding to the broader class of elliptical distributions. A so-called t-model, incorporating a t-copula for the latent vector, is used to show the consequences of a possible generalisation. In this context we introduce the notion of tail dependence. Comparison of the extended t-model with the two 'normal' credit risk models is again performed for the same types of portfolios used for the previous comparison.
Lastly, we study how the portfolio loss distributions of the various models behave as portfolio size increases.
Date:  November 23, 2000  
Type:  Revised diploma thesis under the supervision of Prof. Paul Embrechts and Prof. Dr. Rüdiger Frey  
Project:  Connected to the projects Risk Modelling for a Swiss Retail/Middle Market Loan Portfolio and Dependence Modelling in Risk Management. 

Size:  78 pages  
Textfiles:  Postscript (5968 KBytes), Compressed Postscript (gzip, 3699 KBytes), Portable Document Format (3187 KBytes)

Paper:  Modelling Dependent Defaults: Asset Correlations Are Not Enough!  
Authors:  Prof. Dr. Rüdiger Frey (Swiss Banking Institute, University of Zürich), Dr. Alexander McNeil (Department of Mathematics, ETH Zürich), Mark A. Nyfeler (Investment Office RTC, UBS Zürich)

Size:  8 pages  
Textfiles:  Postscript (453 KBytes), Compressed Postscript (gzip, 194 KBytes), Portable Document Format (275 KBytes)
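Illustration:  The latent-variable comparison described above can be sketched with a one-factor model: obligor i defaults when X_i = sqrt(rho)*M + sqrt(1-rho)*eps_i falls below a threshold, and dividing the Gaussian latent vector by a common chi-square factor turns its copula into a t-copula. All parameter values below are illustrative, not taken from the thesis:

```python
import math
import random

random.seed(11)

def simulate_latents(n, rho, nu=None):
    """One-factor latent variables for two obligors; if nu is given, a
    common sqrt(nu/W) scaling (W ~ chi-square(nu)) yields a t-copula."""
    out = []
    for _ in range(n):
        m = random.gauss(0, 1)
        x1 = math.sqrt(rho) * m + math.sqrt(1 - rho) * random.gauss(0, 1)
        x2 = math.sqrt(rho) * m + math.sqrt(1 - rho) * random.gauss(0, 1)
        if nu is not None:
            w = random.gammavariate(nu / 2.0, 2.0)   # chi-square(nu)
            scale = math.sqrt(nu / w)
            x1, x2 = x1 * scale, x2 * scale
        out.append((x1, x2))
    return out

def joint_default_prob(latents, p):
    """Set the default threshold at the empirical p-quantile of X_1 and
    count scenarios where both obligors fall below it."""
    xs = sorted(x for x, _ in latents)
    d = xs[int(p * len(xs))]
    both = sum(1 for x1, x2 in latents if x1 <= d and x2 <= d)
    return both / len(latents)

n, rho, p = 200000, 0.3, 0.05
jp_gauss = joint_default_prob(simulate_latents(n, rho), p)
jp_t = joint_default_prob(simulate_latents(n, rho, nu=4), p)
# Same marginal default probability and same correlation, yet the t-model
# produces markedly more joint defaults: asset correlations are not enough.
```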


Common Poisson Shock Models: Applications to Insurance and Credit Risk Modelling  
Authors:  Filip Lindskog (RiskLab) and Dr. Alexander McNeil (Department of Mathematics, ETHZ)  
Abstract:  The idea of using common Poisson shock processes to model dependent event frequencies is well known in the reliability literature. In this paper we examine these models in the context of insurance loss modelling and credit risk modelling. To do this we set up a very general common shock framework for losses of a number of different types that allows for both dependence in loss frequencies across types and dependence in loss severities. Our aims are threefold: to demonstrate that the common shock model is a very natural way of approaching the modelling of dependent losses in an insurance or risk management context; to provide a number of analytical results concerning the nature of the dependence implied by the common shock specification; to examine the aggregate loss distribution that results from the model and the sensitivity of its tail to the specification of the model parameters.  
Date:  September 13, 2001  
Type:  Submitted preprint  
Project:  Dependence Modelling in Risk Management  
Size:  22 pages  
Textfiles:  Postscript (480 KBytes), Compressed Postscript (gzip, 199 KBytes), Portable Document Format (247 KBytes)

Slides:  33 slides used in London and New York, November 2000, for a talk on 'Modelling Dependent Credit Risks': Postscript (804 KBytes), Compressed Postscript (gzip, 225 KBytes), Portable Document Format (288 KBytes)
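Illustration:  A minimal Monte Carlo version of the common shock construction: independent Poisson shock processes, with each shock of type e triggering a loss of type j with a fixed probability. The two shock types, rates and trigger probabilities below are invented for the example; the paper's framework is far more general (loss severities, arbitrary numbers of types).

```python
import math
import random

random.seed(3)

def poisson(mean):
    """Inverse-CDF Poisson sampler (stdlib only)."""
    u, p, k = random.random(), math.exp(-mean), 0
    c = p
    while u > c:
        k += 1
        p *= mean / k
        c += p
    return k

# Hypothetical specification: two shock types, two loss types.
shock_rates = [5.0, 2.0]
trigger_prob = [[0.4, 0.1],   # shock type 1 -> (loss type 1, loss type 2)
                [0.5, 0.5]]   # shock type 2 -> (loss type 1, loss type 2)

def simulate_counts(t=1.0):
    """Loss counts (N1, N2) over [0, t] under the common shock model."""
    n = [0, 0]
    for e, lam in enumerate(shock_rates):
        for _ in range(poisson(lam * t)):
            for j in range(2):
                if random.random() < trigger_prob[e][j]:
                    n[j] += 1
    return tuple(n)

samples = [simulate_counts() for _ in range(20000)]
mean1 = sum(s[0] for s in samples) / len(samples)
mean2 = sum(s[1] for s in samples) / len(samples)
cov12 = sum(s[0] * s[1] for s in samples) / len(samples) - mean1 * mean2
# Theory: E N1 = 5*0.4 + 2*0.5 = 3, E N2 = 5*0.1 + 2*0.5 = 1.5, and the
# shared shocks induce Cov(N1, N2) = sum_e lam_e * p_e1 * p_e2 = 0.7.
```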


Volatility Model Risk Measurement and Strategies against Worst Case Volatilities  
Authors:  Dr. Mireille Bossy (INRIA Sophia-Antipolis, France), Prof. Dr. Rajna Gibson (Swiss Banking Institute, University of Zürich), François-Serge Lhabitant (UBS AG), Prof. Nathalie Pistre (ENSAE, France), Dr. Denis Talay (INRIA Sophia-Antipolis, France), Zheng Ziyu (INRIA Sophia-Antipolis, France, and RiskLab)

Abstract:  The paper is a synthesis of three papers on volatility model risk.  
We first describe a Monte Carlo procedure to compute coherent measures of the risk implied by the choice of an erroneous univariate model. It measures the distribution of the losses due to this error, including (but not reduced to) estimation errors. It also takes into account hedging errors in addition to pricing errors (that can be avoided by the calibration of a wrong model on true market data). We state error estimates for the approximation of quantiles of marginal laws of diffusion processes, which is the problem arising in the statistical study of the Profit and Loss of a misspecified hedging strategy.  
This methodology is, however, restricted to the comparison of one (potentially incorrect) model against one or several (possibly true) models within a class of univariate Markov models. We present a more general methodology which aims at selecting a hedging strategy that minimizes the expected utility of the loss due to model risk under the worst possible movements of nature (as characterized by the forward rates' volatility trajectories).
We focus on the case of bond option hedging, but the methodology and the results are relevant for many other financial problems.  
Date:  October 31, 2000  
Type:  Published paper  
Reference:  Journal of the French Society of Statistics, Vol. 141, No. 1-2 (2000), 73-86.
Project:  Model Risk Management for Interest Rate Derivatives  
Size:  18 pages  
Textfiles:  Postscript (819 KBytes), Compressed Postscript (gzip, 191 KBytes), Portable Document Format (495 KBytes)

Slides:  27 slides used by Zheng Ziyu at the Risk Day 2000, October 20, 2000: Postscript (1801 KBytes), Compressed Postscript (gzip, 475 KBytes), Portable Document Format (303 KBytes)


On the Normality of Long-Term Financial Log-Returns
Author:  Olaf Martin Strub  
Abstract:  In this paper we study long-term financial log-returns. The paper is divided into two parts. In the first part we give an overview of models of changing variance and covariance, in particular the class of ARCH/GARCH models and their extensions. Furthermore, we discuss some aspects of temporal aggregation of log-returns. Based on the additivity of the logarithm, long-term log-returns are obtained by simply summing up the single higher-frequency log-returns. It is a main aim of this paper to show that the central limit theorem holds, i.e., that the distribution of the aggregated long-term log-returns tends with increasing lag towards the normal distribution.  
In the second part we introduce various tests for serial correlation and normality and discuss their use for diagnosing problems such as lack of stationarity. Finally, we investigate the behaviour of several real financial time series. For various classes of financial log-returns, namely single stock returns, total stock returns, exchange rates, interest rates, stock market price indices, stock yields as well as bills total returns, we study whether the hypothesis of normality holds for different lag sizes.
Date:  August 25, 2000  
Type:  Diploma thesis under the supervision of Prof. Paul Embrechts, Dr. Alexander McNeil and Dr. Uwe Schmock
Project:  Strategic Long-Term Financial Risks
Size:  104 pages  
Textfiles:  Probably available soon.  
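Illustration:  The aggregation effect studied in the thesis can be seen in a toy simulation: summing i.i.d. heavy-tailed daily log-returns drives excess kurtosis towards zero, in line with the central limit theorem. The Student-t daily return model and all parameters below are assumptions for the example only.

```python
import math
import random

random.seed(5)

def t_return(nu=10, scale=0.01):
    """Heavy-tailed daily log-return: scaled Student-t draw, generated as
    normal / sqrt(chi-square(nu)/nu)."""
    w = random.gammavariate(nu / 2.0, 2.0)   # chi-square(nu)
    return scale * random.gauss(0, 1) / math.sqrt(w / nu)

def excess_kurtosis(xs):
    """Sample excess kurtosis m4/m2^2 - 3."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    return m4 / (m2 * m2) - 3.0

daily = [t_return() for _ in range(200000)]
# By the additivity of the logarithm, a 10-day log-return is just the
# sum of 10 consecutive daily log-returns.
ten_day = [sum(daily[i:i + 10]) for i in range(0, len(daily), 10)]

k_daily = excess_kurtosis(daily)     # about 6/(nu-4) = 1 for t(10)
k_ten = excess_kurtosis(ten_day)     # roughly k_daily / 10 for i.i.d. sums
```

Real returns are not i.i.d. (volatility clustering), so the thesis studies how far this normalization carries over to actual financial data.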

Expected Risk-Adjusted Return for Insurance-Based Models
Author:  Tatiana Solcà  
Abstract:  The management of an insurance company is continually faced with the task of balancing the conflicting interests of policyholders and shareholders. The former are interested in the financial strength of the company; the latter are more concerned with the return on equity. To satisfy both needs, the management selects profitable business and limits the company's risk.
In this diploma thesis we consider a model for an insurance company which has several lines of business. Every line of business sells just one type of contract. The arising liabilities are modelled as a sum of i.i.d. claims plus a catastrophe claim which scales linearly with the number of insurance contracts sold. This catastrophe claim includes the uncertainty about the net insurance premium arising from parameter risk and model risk. The number of sold contracts per line of business can be modelled as a deterministic constant, a Poisson distributed random variable, or a sum of both.  
Within this model we investigate the expected risk-adjusted return using expected shortfall as well as standard deviation as the risk measure. In the case of expected shortfall, we discuss the limiting behaviour of the expected risk-adjusted return as the number of contracts in each line of business tends to infinity. Under mild assumptions, this limit exists and is determined by the distribution of the catastrophe claims. The i.i.d. claims only enter through their expectation, i.e., their risk contribution diversifies away. For the standard deviation risk measure, we also solve a Markowitz-type problem, i.e., we try to maximize the return for a fixed risk capital by selecting the number of sold contracts optimally.
For our model, we also discuss capital allocation based on the expected shortfall principle and the covariance principle. For independent normally distributed claims, most quantities can be calculated explicitly.  
Date:  September 26, 2000  
Type:  Revised diploma thesis under the supervision of Prof. Paul Embrechts and Dr. Uwe Schmock  
Project:  Connected to the project Rules of Capital Allocation (CAPA) and Coherent Measures of Risk  
Size:  78 pages  
Textfiles:  Postscript (2922 KBytes), Compressed Postscript (gzip, 625 KBytes), Portable Document Format (869 KBytes)
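Illustration:  The expected shortfall risk measure used above has a simple empirical form: the average of the worst alpha-fraction of losses. A sketch with a hypothetical normally distributed aggregate loss (mean 100, standard deviation 20; these are not parameters from the thesis):

```python
import random

random.seed(1)

def expected_shortfall(losses, alpha=0.05):
    """Empirical expected shortfall at tail level alpha: the average of
    the worst alpha-fraction of the losses (losses are positive numbers)."""
    xs = sorted(losses, reverse=True)
    k = max(1, int(alpha * len(xs)))
    return sum(xs[:k]) / k

# Hypothetical line of business with normal aggregate loss.
losses = [random.gauss(100, 20) for _ in range(100000)]
es = expected_shortfall(losses)
# For a normal loss, ES at level 0.05 is mu + sigma * phi(z_0.95)/0.05,
# i.e. about 100 + 20 * 2.063 here.
```

Unlike the quantile (VaR) itself, this tail average is subadditive, which is why it appears as the coherent risk measure in the capital allocation discussion.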


Linear Correlation Estimation  
Author:  Filip Lindskog (RiskLab)  
Abstract:  Most financial models for dependent risks are based on the assumption of multivariate normality, and linear correlation is used as a measure of dependence. However, observed financial data are rarely normally distributed and tend to have marginal distributions with heavier tails. Furthermore, the observed synchronized extreme falls in financial markets cannot be modelled by multivariate normal distributions. There are, however, other elliptical distributions with these properties, and the assumption of multivariate normality can often be replaced by the assumption of ellipticality. A useful property of elliptical distributions is that they support the standard approaches of risk management: Value-at-Risk fulfils the desired properties of a risk measure, and the mean-variance (Markowitz) approach can be used for portfolio optimization.
For elliptical distributions, linear correlation is still a natural measure of dependence. However, a linear correlation estimator such as the Pearson product-moment correlation estimator (the standard estimator), while suitable for data from uncontaminated multivariate normal distributions, performs very badly for heavier-tailed or contaminated data. Therefore robust estimators are needed: robust in the sense of being insensitive to contamination, yet maintaining a high efficiency for heavier-tailed elliptical distributions as well as for multivariate normal distributions.
In this paper an overview of techniques for robust linear correlation estimation is given and the various techniques are compared for contaminated and uncontaminated elliptical distributions. In particular, an explicit relation between Kendall's tau and the linear correlation coefficient rho is shown to hold for (essentially) all elliptical distributions and the estimator of linear correlation provided by this relation is studied. This nonparametric estimator inherits the robustness properties of the Kendall's tau estimator and is an efficient (low variance) estimator for all elliptical distributions.  
Keywords:  robust estimation, linear correlation, Kendall's tau, elliptical distributions  
Date:  August 2, 2000 (updated Oct. 9; page 6 updated Dec. 11, 2000)  
Type:  RiskLab report  
Project:  Dependence Modelling in Risk Management  
Size:  35 pages  
Textfiles:  Postscript (1752 KBytes), Compressed Postscript (gzip, 490 KBytes), Portable Document Format (611 KBytes)


Strategic Long-Term Financial Risks: The One-Dimensional Case
Authors:  Roger Kaufmann and Pierre Patie (both RiskLab)  
Abstract:  The development of a methodology for measuring strategic long-term financial risks is becoming an important task. Existing modelling instruments allow for a good measurement of market risks of trading books over relatively short time intervals. However, these approaches may have severe deficiencies if applied to longer time periods. In this paper we give an overview of methodologies proposed to model the evolution of risk factors over a long horizon. We investigate in detail the statistical properties and the behaviour of financial time series at different frequencies. Then we test the different models on these data by backtesting expected shortfall predictions.
Keywords:  Value-at-risk, expected shortfall, random walk models, vector autoregressive models, jump-diffusion models, stochastic volatility models, extreme value theory, scaling rules.
Date:  December 2, 2003 (first version July 12, 2000)  
Type:  Final project report  
Project:  Strategic Long-Term Financial Risks (SLTFR)
Size:  61 pages  
Textfiles:  Postscript (2327 KBytes), Compressed Postscript (gzip, 1015 KBytes), Portable Document Format (937 KBytes)

Slides:  55 slides used at the workshop at Swiss Re, June 22, 2000: Postscript (1009 KBytes), Compressed Postscript (gzip, 430 KBytes), Portable Document Format (587 KBytes); 27 slides used at the Risk Day 2000, October 20, 2000: Postscript (251 KBytes), Compressed Postscript (gzip, 93 KBytes), Portable Document Format (300 KBytes); 23 slides used at the RiskLab Workshop on Risk Management, June 7, 2001: Postscript (200 KBytes), Compressed Postscript (gzip, 80 KBytes), Portable Document Format (117 KBytes)


Passport Options with Stochastic Volatility  
Authors:  Dr. Victoria Henderson (RiskLab), Dr. David Hobson (Department of Mathematical Sciences, University of Bath, UK)

Abstract:  A passport option is a call option on the profits of a trading account. In this article we investigate the robustness of passport option pricing by incorporating stochastic volatility. The key feature of a passport option is the holder's optimal strategy. It is known that in the case of exponential Brownian motion the strategy is to be long if the trading account is below zero and short if the account is above zero. Here we extend this result to models with stochastic volatility where the volatility is defined via an autonomous SDE. It is shown that for certain models of this type, the form of the optimal strategy remains unchanged. This means that pricing is robust to misspecification of the underlying model.  
A second aim of this article is to investigate some of the biases which become apparent in a stochastic volatility regime. Using an analytic approximation we are able to obtain comparisons for passport option prices using the exponential Brownian motion model and some well known stochastic volatility models. This is illustrated by a number of numerical examples. One conclusion is that fair prices are generally lower in a model with stochastic volatility than in a model with constant volatility.  
Keywords:  passport option, option pricing, stochastic volatility, Hull and White model  
Date:  March 31, 2000  
Type:  Preprint  
Reference:  Accepted for publication in "Applied Mathematical Finance"  
Size:  22 pages  
Textfiles:  Postscript (1722 KBytes), Compressed Postscript (gzip, 371 KBytes), Portable Document Format (452 KBytes)


Modelling Dependence with Copulas and Applications to Risk Management  
Authors:  Prof. Dr. Paul Embrechts (Department of Mathematics, ETH Zürich), Dr. Alexander McNeil (Department of Mathematics, ETH Zürich), Filip Lindskog (RiskLab)

Abstract:  The modelling of dependence is one of the most crucial issues in risk management. Whereas classically dependence was equated with linear correlation, more recently, mainly due to extremal market moves, the limitations of the linear correlation concept were strongly felt. In order to stress test dependence in a financial or insurance portfolio, the notion of a copula offers a versatile tool. In this paper, we recall some of the basic properties of copulas, discuss the underlying simulation and numerical issues, and highlight through examples the potential use of copula-based techniques in integrated risk management.
Keywords:  dependence, copulas, simulation, rank correlation, tail dependence, risk management  
Date:  September 10, 2001 (first version March 13, 2000)  
Origin:  Master's thesis of Filip Lindskog under the guidance of Dr. Alexander McNeil and Prof. Dr. Paul Embrechts
Type:  Chapter of a forthcoming book  
Project:  Dependence Modelling in Risk Management  
Size:  50 pages  
Textfiles:  Postscript (886 KBytes), Compressed Postscript (gzip, 355 KBytes), Portable Document Format (526 KBytes)

Slides:  Postscript (825 KBytes), Compressed Postscript (gzip, 393 KBytes), Portable Document Format (448 KBytes) (35 slides, pages 29-32 updated July 2, 2000)
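Illustration:  The basic copula simulation recipe discussed in the paper: sample dependent uniforms from a copula, then push them through inverse marginal CDFs. A sketch for a bivariate Gaussian copula with exponential marginals (the marginals and the correlation 0.7 are arbitrary choices for the example):

```python
import math
import random

random.seed(2)

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gaussian_copula_pair(rho):
    """One draw from a bivariate Gaussian copula: correlated normals
    mapped to uniforms by the normal CDF."""
    z1 = random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho * rho) * random.gauss(0, 1)
    return norm_cdf(z1), norm_cdf(z2)

def sample(rho=0.7, mean1=1.0, mean2=2.0):
    """Couple two exponential marginals at the copula level."""
    u1, u2 = gaussian_copula_pair(rho)
    x1 = -mean1 * math.log(1.0 - u1)   # inverse exponential CDF
    x2 = -mean2 * math.log(1.0 - u2)
    return x1, x2

pairs = [sample() for _ in range(50000)]
m1 = sum(x for x, _ in pairs) / len(pairs)   # marginal mean stays ~1.0
```

By Sklar's theorem the marginals and the dependence structure are specified separately, so the same copula can be reused to stress test dependence while holding the marginals fixed.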


Model Risk with JumpDiffusion Processes  
Author:  Aydin Akgün (RiskLab and Swiss Banking Institute)  
Abstract:  The main goal of the study is to quantify the effects of model risk within the context of risk management for interest rate derivatives. Assuming that the model risk emanates from the omission of a finite number of jump terms in the instantaneous forward rate specification, the profit and loss (P&L) function of a short position in a zero-coupon bond option is derived. A closed-form solution for the bond option value is provided for the case where jumps are driven by a finite state-space compound Poisson process. Numerical methods are used to obtain the empirical distribution of the P&L function. The magnitude of model risk is determined in different contexts, and a sensitivity analysis is conducted with respect to parameters such as moneyness, time to maturity, volatility, and the intensity of the jump terms.
Date:  February 2000  
Type:  Published paper  
Project:  Model Risk Management for Interest Rate Derivatives
Reference:  Pages 181-207 in Model Risk: Concepts, Calibration and Pricing, edited by Prof. Rajna Gibson, Risk Books, ISBN 1 899 332 89 8


Valuation of Exotic Options Under Short-Selling Constraints
Authors:  Dr. Uwe Schmock (RiskLab, ETH Zürich), Prof. Steven E. Shreve (Carnegie Mellon University, USA), Dr. Uwe Wystup (Commerzbank Treasury and Financial Products, Germany)

Abstract:  Options with discontinuous payoffs are generally traded above their theoretical BlackScholes prices because of the hedging difficulties created by their large delta and gamma values. A theoretical method for pricing these options is to constrain the hedging portfolio and incorporate this constraint into the pricing by computing the smallest initial capital which permits superreplication of the option. We develop this idea for exotic options, in which case the pricing problem becomes one of stochastic control. Our motivating example is a call which knocks out in the money, and explicit formulas for this and other instruments are provided.  
Keywords:  Exotic options, superreplication, stochastic control  
MSC 2000:  91B28 Finance, portfolios, investment; 60H30 Applications of stochastic analysis (to PDE, etc.); 60G44 Martingales with continuous parameter

JEL Code:  G13  
Date:  December 17, 1999 (revised version February 21, 2001)  
Type:  Paper  
Reference:  Finance and Stochastics 6, 143-172 (2002)
Size:  30 pages including 3 figures  
Textfiles:  Portable Document Format (363 KBytes), Postscript (1156 KBytes), Compressed (gzipped) Postscript (277 KBytes)


Introduction to Dynamic Financial Analysis  
Authors:  Roger Kaufmann (RiskLab), Andreas Gadmer and Ralf Klett (Zurich Financial Services)  
Abstract:  In the last few years we have witnessed growing interest in Dynamic Financial Analysis (DFA) in the non-life insurance industry. DFA combines many economic and mathematical concepts and methods, and it is almost impossible to identify and describe a unique DFA methodology. Some DFA software products for non-life companies are available in the market, each relying on its own approach to DFA. Our goal is to give an introduction to this field by presenting a model framework comprising those components that many DFA models have in common. By explicit reference to mathematical language we introduce an up-and-running model that can easily be implemented and adjusted to individual needs. An application of this model is presented as well.
Keywords:  non-life insurance, dynamic financial analysis, asset/liability management, stochastic simulation, business strategy, efficient frontier, solvency testing, interest rate models, claims, reinsurance, underwriting cycles, payment patterns.
Date:  April 26, 2001 (first version December 15, 1999)  
Project:  Dynamic Financial Analysis in Non-Life Insurance  
Type:  Published paper  
Reference:  Astin Bulletin, Vol. 31, No. 1 (May 2001), pages 213-249.  
Comment:  The paper is partially based on Roger Kaufmann's diploma thesis written in cooperation with Zurich Financial Services. The diploma thesis was awarded the Walter Saxer Insurance Prize in the year 2000.  
Size:  32 pages  
Textfiles:  Postscript (689 KBytes) Compressed Postscript (gzip, 319 KBytes) Portable Document Format (372 KBytes) 


Modeling the Term Structure of Interest Rates: A Review of the Literature  
Authors:  Prof. Dr. Rajna Gibson (Swiss Banking Institute, University of Zürich), François-Serge Lhabitant (UBS AG), Dr. Denis Talay (INRIA Sophia-Antipolis, France) 

Abstract:  The last two decades have seen the development of a profusion of theoretical models of the term structure of interest rates. This study provides a general overview and a comprehensive comparative study of the models most popular among both academics and practitioners. It also discusses their respective advantages and disadvantages in terms of continuous-time valuation or hedging of bonds and/or interest rate contingent claims, parameter estimation, and calibration. Finally, it proposes a unified approach for model risk assessment. Despite the relatively complex mathematics involved, financial intuition rather than mathematical rigour is emphasised throughout. The classification by means of general characteristics should enable the understanding of the different features of each model, facilitate the choice of a model in specific theoretical or empirical circumstances, and allow the testing of various models with nested as well as non-nested specifications.  
Date:  October 1999, updated June 2001  
Project:  Model Risk Management for Interest Rate Derivatives  
Size:  97 pages  
Textfiles:  Postscript (3029 KBytes) Compressed Postscript (gzip, 672 KBytes) Portable Document Format (713 KBytes) 


Correlation and Dependence in Risk Management: Properties and Pitfalls  
Authors:  Prof. Dr. Paul Embrechts (D-Math, ETH Zürich), Dr. Alexander McNeil (Swiss Re research fellow, D-Math, ETH Zürich), Daniel Straumann (RiskLab) 

Abstract:  Modern risk management calls for an understanding of stochastic dependence going beyond simple linear correlation. This paper deals with the static (non-time-dependent) case and emphasizes the copula representation of dependence for a random vector. Linear correlation is a natural dependence measure for multivariate normally and, more generally, elliptically distributed risks, but other dependence concepts like comonotonicity and rank correlation should also be understood by the risk management practitioner. Using counterexamples, the falsity of some commonly held views on correlation is demonstrated; in general, these fallacies arise from the naive assumption that dependence properties of the elliptical world also hold in the non-elliptical world. In particular, the problem of finding multivariate models which are consistent with prespecified marginal distributions and correlations is addressed. Pitfalls are highlighted and simulation algorithms avoiding these problems are constructed.  
Keywords:  risk management, correlation, elliptical distributions, rank correlation, dependence, copula, comonotonicity, simulation, Value-at-Risk, coherent risk measures  
Date:  August 9, 1999  
Type:  Paper  
Project:  Correlation in Insurance and Finance  
Size:  37 pages  
Textfiles:  Postscript (1640 KBytes) Compressed Postscript (gzip, 431 KBytes) Portable Document Format (521 KBytes) 

Reference:  Short version published in Extremes and Integrated Risk Management, edited by Prof. Paul Embrechts, foreword by Dr. Marcel Rohner (Chief Risk Officer, UBS), Risk Books (2000), pp. 71-76, ISBN 1 899 332 74 X, and in RISK 5 (1999), pp. 69-71  
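One of the fallacies discussed above can be seen numerically: linear correlation changes under a strictly increasing transformation of the marginals, while rank correlation, which depends only on the copula, does not. A small sketch of this effect (illustrative, not code from the paper; sample size and seed are arbitrary):

```python
import numpy as np

def pearson(x, y):
    """Linear (Pearson) correlation coefficient."""
    return np.corrcoef(x, y)[0, 1]

def spearman(x, y):
    """Rank (Spearman) correlation: Pearson correlation of the ranks."""
    rank = lambda v: np.argsort(np.argsort(v))
    return pearson(rank(x), rank(y))

rng = np.random.default_rng(0)
# Bivariate normal sample with linear correlation 0.7.
x, y = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=10_000).T

# exp() is strictly increasing: it changes the marginals (to lognormal)
# but leaves the copula, and hence all ranks, untouched.
u, v = np.exp(x), np.exp(y)

print(pearson(x, y), pearson(u, v))    # linear correlation shrinks
print(spearman(x, y), spearman(u, v))  # rank correlation is invariant
```

Although the dependence structure of (u, v) is identical to that of (x, y), the linear correlation drops from about 0.7 to roughly 0.6; the paper's counterexamples formalize exactly this kind of effect.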

A Survey and Some Generalizations of Bessel Processes  
Authors:  Dr. Anja Göing-Jaeschke (RiskLab, ETH Zürich), Prof. Marc Yor (Université Pierre et Marie Curie, Laboratoire de Probabilités) 

Abstract:  Bessel processes play an important role in financial mathematics because of their strong relation to financial processes like geometric Brownian motion or CIR processes. We are interested in the first time Bessel processes and, more generally, radial Ornstein-Uhlenbeck processes hit a given barrier. We give explicit expressions for the Laplace transforms of first hitting times by (squared) radial Ornstein-Uhlenbeck processes, i.e., CIR processes. As a natural extension we study squared Bessel processes and squared Ornstein-Uhlenbeck processes with negative dimensions or negative starting points and derive their properties.  
Keywords:  First hitting times; CIR processes; Bessel processes; radial Ornstein-Uhlenbeck processes; Bessel processes with negative dimensions  
Date:  1999  
Type:  Paper  
Reference:  Bernoulli, April 2003 (to appear)  
Project:  Generalizations of Bessel Processes  
Size:  38 pages  
Textfiles:  Postscript (1467 KBytes) Compressed Postscript (gzip, 400 KBytes) Portable Document Format (712 KBytes) 


Coherent Allocation of Risk Capital  
Author:  Dr. Michel Denault (IFOR and RiskLab)  
Abstract:  The allocation problem stems from the diversification effect observed in risk measurements of financial portfolios: the sum of the "risks" of many portfolios is larger than the "risk" of the sum of the portfolios. The allocation problem is to apportion this diversification advantage to the portfolios in a fair manner, yielding, for each portfolio, a risk appraisal that accounts for diversification.  
Our approach is axiomatic, in the sense that we first argue for the necessary properties of an allocation principle, and then consider principles that fulfill the properties. Important results from the area of game theory find a direct application. Our main result is that the Aumann-Shapley value is both a coherent and practical approach to financial risk allocation.  
Keywords:  allocation of capital, coherent risk measure, risk-adjusted performance measure, game theory, fuzzy games, Shapley value, Aumann-Shapley prices.  
Date:  January 2001 (original version October 27, 1999)  
Type:  Published paper  
Reference:  Journal of Risk, vol. 4, no. 1 (2001), pages 1-34.  
Project:  Coherent Allocation of the Risk Capital  
Size:  39 pages  
Textfiles:  Postscript (938 KBytes) Compressed Postscript (gzip, 246 KBytes) Portable Document Format (311 KBytes) 
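For a positively homogeneous risk measure the Aumann-Shapley value coincides with the Euler allocation; taking the standard deviation of the aggregate position as the risk measure, portfolio i is charged cov(X_i, S)/sigma(S), where S is the firm-wide P&L. A minimal numerical sketch of this special case (illustrative random data, not an example from the paper):

```python
import numpy as np

def euler_allocation(pnl):
    """Aumann-Shapley/Euler allocation of standard-deviation risk.

    pnl: (n_portfolios, n_scenarios) array of P&L scenarios.
    Portfolio i is charged cov(X_i, S) / sigma(S), with S the total P&L.
    """
    total = pnl.sum(axis=0)
    sigma = total.std()
    return np.array([np.cov(x, total, bias=True)[0, 1] / sigma for x in pnl])

rng = np.random.default_rng(1)
pnl = rng.normal(size=(3, 50_000))   # three hypothetical portfolios

alloc = euler_allocation(pnl)
total_risk = pnl.sum(axis=0).std()
print(alloc, alloc.sum(), total_risk)
```

The contributions add up exactly to the total risk ("full allocation"), and each contribution is at most the portfolio's stand-alone risk: the diversification advantage is apportioned, which is the fairness property the axioms demand.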


A Methodology to Analyze Model Risk with an Application to Discount Bond Options in a Heath-Jarrow-Morton Framework  
Authors:  Dr. Mireille Bossy (INRIA Sophia-Antipolis, France), Prof. Dr. Rajna Gibson (Swiss Banking Institute, University of Zürich), François-Serge Lhabitant (UBS AG), Prof. Nathalie Pistre (ENSAE, France), Dr. Denis Talay (INRIA Sophia-Antipolis, France) 

Abstract:  In this paper, we propose a general methodology to analyse model risk for discount bond options within a unified Heath, Jarrow and Morton (1992) framework. We illustrate its applicability by focusing on the hedging of discount bond options and options portfolios. We show how to decompose the agent's "model risk" profit and loss, and emphasize the importance of the position's gamma in order to control it. We further provide mathematical results on the distribution of the forward profit and loss function for specific Markov univariate term structure models. Finally, we run numerical simulations for naked and combined option hedging strategies in order to quantify the sensitivity of the forward profit and loss function with respect to the volatility of the forward rate curve, the shape of the term structure, and the characteristics of the position being hedged.  
Date:  June 2001 (First draft April 1998)  
Project:  Model Risk Management for Interest Rate Derivatives  
Size:  42 pages  
Textfiles:  Postscript (1716 KBytes) Compressed Postscript (gzip, 447 KBytes) Portable Document Format (441 KBytes) 


Price Comparison Results and Super-Replication: An Application to Passport Options  
Author:  Dr. Victoria Henderson (RiskLab)  
Abstract:  In this paper, we provide a new proof of the result that option prices are increasing in volatility when the underlying is a diffusion process. This has been shown to hold for convex-payoff, path-independent options by El Karoui et al. and Hobson, amongst others. The advantage of the new proof is that it can be extended to establish monotonicity results for path-dependent payoffs where the payoff depends on the maximum (or minimum) of the asset price process. The techniques used to prove these results are mean comparison theorems of Hajek and coupling of stochastic processes.  
Using these results, and the connection between passport and lookback options, we prove that the price of a passport option is increasing in volatility for general diffusion models of the asset price. It is shown that the seller of a passport option can super-replicate if the volatility is overestimated, regardless of the strategy followed by the holder.  
Keywords:  stochastic volatility, passport option, comparison theorems, diffusions, coupling, path-dependent options  
Date:  May 22, 1999 (updated May 2, 2000)  
Type:  Published paper  
Reference:  Applied Stochastic Models in Business and Industry, Vol. 16, no. 4, Oct.-Dec. 2000, pp. 297-310  
Size:  17 pages  
Textfiles:  Postscript (605 KBytes) Compressed Postscript (gzip, 154 KBytes) Portable Document Format (173 KBytes) 

Related:  Passport Options with Stochastic Volatility  

Parameter Estimation and Bessel Processes in Financial Models  
Author:  Dr. Anja Göing-Jaeschke (RiskLab, Department of Mathematics, ETH Zürich)  
Abstract:  Multipurpose parameter estimation methods play an increasingly important role in financial as well as insurance mathematics. We study appropriate methods for estimating parameters in continuous-time processes based on discrete observations, resulting in consistent and asymptotically normal estimators. Cox-Ingersoll-Ross (CIR) processes for modeling short-term interest rates are considered as a class of financial diffusion processes. Applying the estimation methods to simulated data of these processes, we show the numerical quality of the resulting estimates.  
Bessel processes emerge in many financial problems and have remarkable properties. Via a space-time transformation they are related to CIR processes. In many financial applications, the calculation of the first time a diffusion process reaches a certain level is important, as for instance in the case of barrier options. We explicitly calculate the Laplace transform of the first time Bessel processes and related processes, such as CIR processes, hit a given barrier. As a generalization, we introduce Bessel processes allowing negative dimensions. For instance, geometric Brownian motion with negative drift is closely related to these processes. We show their properties and derive their transition densities. Time reversal is a crucial tool throughout the calculations and time-reversed diffusions are investigated in detail.  
Date:  1998  
Projects:  Generalizations of Bessel Processes and Estimation in Financial Models  
Type:  Part of Diss. ETH No. 12566  
Size:  119 pages  
Textfiles:  Postscript (3066 KBytes, two pages per sheet) Compressed Postscript (gzip, 808 KBytes, two pages per sheet) Portable Document Format (1022 KBytes) 
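The first-hitting-time setting above can be illustrated numerically: the sketch below simulates the CIR dynamics dr = a(b - r)dt + sigma*sqrt(r) dW with a truncated Euler scheme and records the first time the path falls to a barrier. This is only a crude Monte Carlo illustration (parameters, scheme and barrier are hypothetical); the thesis works with exact Laplace transforms instead:

```python
import math
import random

def cir_hitting_time(r0, a, b, sigma, barrier, dt=1e-4, t_max=50.0, seed=None):
    """First time a truncated-Euler path of the CIR process
    dr = a*(b - r) dt + sigma*sqrt(r) dW falls to `barrier` (None if never)."""
    rng = random.Random(seed)
    r, t = r0, 0.0
    while t < t_max:
        if r <= barrier:
            return t
        dw = rng.gauss(0.0, math.sqrt(dt))
        # max(r, 0) guards the square root against negative excursions.
        r += a * (b - r) * dt + sigma * math.sqrt(max(r, 0.0)) * dw
        t += dt
    return None

# Sanity check against the deterministic limit sigma = 0, where
# r(t) = b + (r0 - b) * exp(-a t) hits the barrier at an explicit time.
r0, a, b, barrier = 0.10, 0.5, 0.02, 0.05
t_euler = cir_hitting_time(r0, a, b, sigma=0.0, barrier=barrier)
t_exact = math.log((r0 - b) / (barrier - b)) / a
print(t_euler, t_exact)
```

In the deterministic limit the Euler hitting time agrees with the closed-form value to within the step size, which is a useful check before turning the noise back on.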


Interest Rate Model Risk: What are we Talking About?  
Authors:  Prof. Dr. Rajna Gibson (Swiss Banking Institute, University of Zürich), François-Serge Lhabitant (UBS AG), Prof. Nathalie Pistre (ENSAE, France), Dr. Denis Talay (INRIA Sophia-Antipolis, France) 

Abstract:  Model risk has recently become an increasingly important concept, not only in financial valuation but also for risk management models and capital adequacy purposes. It arises as a consequence of incorrect modelling, model identification and specification errors, and inadequate estimation procedures, as well as from the mathematical and statistical properties of financial models applied in imperfect financial markets. This paper provides a definition of model risk, identifies its possible origins, proposes a methodology to analyse and quantify model risk, and finally offers some illustrations of its consequences in the context of the valuation and risk management of interest rate contingent claims.  
Date:  May 1998  
Project:  Model Risk Management for Interest Rate Derivatives  
Reference:  Journal of Risk, vol. 1 (3), pp. 37-62, 1999.  
Size:  34 pages  
Textfiles:  Postscript (1446 KBytes) Compressed Postscript (gzip, 310 KBytes) Portable Document Format (427 KBytes) 


Market Risk Computation for Nonlinear Portfolios  
Author:  Gerold Studer (former member of RiskLab and IFOR, ETH Zürich)  
Abstract:  Maximum loss is introduced as a method for analyzing the market risk of portfolios by identifying the worst case in a given set of scenarios, called the trust region. After discussing some important relationships between maximum loss and value-at-risk, a technique for the maximum loss computation for quadratic functions is described. The repetitive application of the technique to growing trust regions leads to a sequence of worst case scenarios which form a complete path. The approximation of arbitrary profit and loss functions by a sequence of quadratic functions allows the efficient analysis of non-quadratic portfolios.  
Type:  Published paper  
Reference:  Journal of Risk, Vol. 1, No. 4, 1999  
Project:  Risk Aggregation Techniques for Complex Financial Portfolios  

Risk Measurement with Maximum Loss  
Author:  Gerold Studer (former member of RiskLab and IFOR, ETH Zürich)  
Abstract:  Effective risk management requires adequate risk measurement. A basic problem herein is the quantification of market risks: what is the overall effect on a portfolio if market rates change? First, a mathematical problem statement is given and the concept of Maximum Loss (ML) is introduced as a method for identifying the worst case in a given set of scenarios, called the Trust Region. Next, a technique for efficiently calculating the Maximum Loss for quadratic functions is described; the algorithm is based on the Levenberg-Marquardt theorem, which reduces the high-dimensional optimization problem to a one-dimensional root-finding problem.  
Following this, the idea of the Maximum Loss Path is presented: repetitive calculation of ML for growing trust regions leads to a sequence of worst case scenarios, which form a complete path; similarly, the path of Maximum Profit (MP) can be determined. Finally, all these concepts are applied to non-quadratic portfolios: so-called Dynamic Approximations are used to replace arbitrary profit and loss functions by a sequence of quadratic functions, which can be handled with efficient solution procedures. A description of the overall algorithm rounds off the discussion of nonlinear portfolios.  
Key words:  Risk measurement, global optimization, quadratic programming, nonlinear programming, polynomial approximation algorithm  
Type:  Published paper  
Reference:  Mathematical Methods of Operations Research, Vol. 50, No. 1, 1999, pp. 121-134  
Project:  Risk Aggregation Techniques for Complex Financial Portfolios  
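The reduction to one-dimensional root finding can be sketched for the simplest convex case: minimise the quadratic P&L f(x) = g'x + 0.5 x'Hx, with H positive definite, over a spherical trust region ||x|| <= r. The constrained worst case is x(lam) = -(H + lam*I)^(-1) g for a single multiplier lam >= 0 fixed by ||x(lam)|| = r, here found by bisection. (A hedged sketch under these simplifying assumptions; the paper treats general quadratics over ellipsoidal trust regions via the Levenberg-Marquardt theorem.)

```python
import numpy as np

def max_loss_quadratic(g, H, r, tol=1e-10):
    """Worst case (minimum) of f(x) = g.x + 0.5 x.H.x over the ball ||x|| <= r.

    Assumes H is symmetric positive definite, so f is convex and the
    high-dimensional problem collapses to root finding in the single
    Lagrange multiplier lam.
    """
    g, H = np.asarray(g, float), np.asarray(H, float)
    I = np.eye(len(g))
    x = -np.linalg.solve(H, g)                 # unconstrained minimiser
    if np.linalg.norm(x) > r:                  # constraint is active
        lo, hi = 0.0, 1.0                      # bracket: ||x(lam)|| decreases in lam
        while np.linalg.norm(np.linalg.solve(H + hi * I, -g)) > r:
            hi *= 2
        while hi - lo > tol:                   # bisection on ||x(lam)|| = r
            lam = 0.5 * (lo + hi)
            if np.linalg.norm(np.linalg.solve(H + lam * I, -g)) > r:
                lo = lam
            else:
                hi = lam
        x = np.linalg.solve(H + hi * I, -g)
    return x, g @ x + 0.5 * x @ H @ x

# Toy 2-D portfolio: f(x) = -x1 + 0.5 ||x||^2, trust region of radius 0.5.
x_star, ml = max_loss_quadratic([-1.0, 0.0], np.eye(2), 0.5)
print(x_star, ml)   # worst case on the boundary, approx (0.5, 0) with f = -0.375
```

In the toy example the unconstrained minimiser (1, 0) lies outside the trust region, so the worst case sits on the boundary at lam = 1; non-convex H or an ellipsoidal region needs the paper's more careful treatment.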

Quadratic Maximum Loss  
Authors:  Prof. Hans-Jakob Lüthi and Gerold Studer (both IFOR, ETH Zürich)  
Abstract:  Regulatory authorities such as the BIS require institutions to use both value-at-risk models and stress testing. This paper aims to build a bridge between the two methods by examining the concept of maximum loss. In contrast to VAR, which depends on holding periods and confidence levels, ML has a supplementary degree of freedom known as a trust region. ML represents the worst case over such a trust region, and the calculation therefore identifies the worst case scenario for a portfolio.  
Date:  1997  
Type:  Published paper  
Reference:  In: VAR: Understanding and Applying Value-at-Risk, Risk Publications, London, 1997, pp. 307-316  
Project:  Risk Aggregation Techniques for Complex Financial Portfolios  

Maximum Loss for Risk Measurement of Portfolios  
Authors:  Prof. Hans-Jakob Lüthi and Gerold Studer (both IFOR, ETH Zürich)  
Abstract:  Effective risk management requires adequate risk measurement. A basic problem herein is the quantification of market risk: what is the overall effect on a portfolio if the market rates change? First, a mathematical problem statement is given and the concept of Maximum Loss (ML) is introduced as a method for identifying the worst case in a given scenario space, called the Trust Region. Next, a technique for efficiently calculating ML for quadratic functions is described; the algorithm is based on the Levenberg-Marquardt theorem, which reduces the high-dimensional optimization problem to a one-dimensional root-finding problem.  
Following this, the idea of the Maximum Loss Path is presented: repetitive calculation of ML for a growing trust region leads to a sequence of worst cases, which form a complete path. Similarly, the paths of Maximum Profit (MP) and Expected Value (EV) can be determined. Comparing them permits judgements on the quality of portfolios. These concepts are also applicable to non-quadratic portfolios by using Dynamic Approximations, which replace arbitrary profit and loss functions by a sequence of quadratic functions.  
Finally, the idea of the Maximum Loss Distribution is explained. The distributions of ML and MP can be obtained directly from the ML and MP paths. They lead to lower and upper bounds of the true profit and loss distribution and allow statements about the spread of ML and MP.  
Date:  1996  
Type:  Published proceedings contribution  
Reference:  In: Operations Research Proceedings 1996, Springer, Berlin, 1997, pp. 386-391  
Project:  Risk Aggregation Techniques for Complex Financial Portfolios  

Maximum Loss for Measurement of Market Risk  
Author:  Dr. Gerold Studer (IFOR, ETH Zürich)  
Abstract:  Effective risk management requires adequate risk measurement. A basic problem herein is the quantification of market risk: what is the overall effect on a portfolio's value if the market rates change? To answer this question, two fundamental difficulties have to be managed: first, market rates behave randomly and are correlated. Second, portfolio structures are high-dimensional and typically nonlinear. The established risk measurement techniques can be divided into two categories. The purely stochastic approaches are based on the portfolio's profit and loss (P&L) distribution. The most popular method in this class is Value-at-Risk (VaR), which typically represents the 1 or 5 percent quantile of the P&L distribution. The Maximum Loss (ML) methodology is a member of the second category, where risk is quantified by the value of some worst case scenario. Many of these worst case based measures examine a finite set of scenarios and do not take account of correlations (e.g. stress testing). More elaborate methods determine the worst case scenario by solving constrained minimization problems, where the set of feasible scenarios is generally defined by the stochastic characteristics of the market rates. Compared to other worst case techniques, the Maximum Loss methodology uses a very particular choice of feasible domains: the so-called trust regions cover a certain percentage of all future outcomes and their shape reflects the correlation structure of the market rates. Furthermore, ML uses a polynomial time algorithm to identify the global minimum, whereas other techniques employ rudimentary optimization procedures. The fact that the Maximum Loss computation is based on a fast and reliable algorithm makes it possible to solve the optimization problem repeatedly, which leads to new insights into the risks of nonlinear portfolios.  
Date:  1997  
Type:  Diss. ETH No. 12397  
Project:  Risk Aggregation Techniques for Complex Financial Portfolios  
Size:  112 pages  
Textfiles:  Postscript (2910 KBytes) Compressed Postscript (gzip, 618 KBytes) Portable Document Format (847 KBytes) 
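For a linear portfolio f(x) = w'x with jointly normal risk factors, both quantities have closed forms: VaR at level alpha is z_alpha * sqrt(w'Sw), while ML over the ellipsoidal trust region {x : x'S^(-1)x <= c_alpha} covering probability mass alpha is sqrt(c_alpha) * sqrt(w'Sw), with c_alpha a chi-square quantile. Since c_alpha exceeds z_alpha^2 in more than one dimension, ML is the more conservative number. A small sketch of this comparison (hypothetical two-factor portfolio; the standard quantile values are hard-coded to four decimals):

```python
import math

# Hypothetical two-factor portfolio: weights and factor covariance matrix.
w = [2.0, -1.0]
cov = [[0.04, 0.012],
       [0.012, 0.09]]

# Portfolio variance w' Sigma w, expanded by hand for the 2-D case.
var_p = sum(w[i] * cov[i][j] * w[j] for i in range(2) for j in range(2))
sigma_p = math.sqrt(var_p)

Z_99 = 2.3263      # 99% standard normal quantile
C_99_2D = 9.2103   # 99% chi-square quantile, 2 degrees of freedom

var_99 = Z_99 * sigma_p               # Delta-Normal VaR
ml_99 = math.sqrt(C_99_2D) * sigma_p  # Maximum Loss over the 99% ellipsoid

print(var_99, ml_99)   # ML exceeds VaR: sqrt(9.2103) ~ 3.03 > 2.33
```

Both numbers scale with the same portfolio standard deviation; the difference is entirely in the quantile factor, which is why ML grows relative to VaR as the number of risk factors (chi-square degrees of freedom) increases.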


Approximation of P&L Distributions, Part II  
Authors:  Prof. Karl Frauendorfer, Pierre-Yves Moix and Olivier Schmid (Institute of Operations Research, University of St. Gallen)  
Abstract:  A former investigation (Approximation of Profit-and-Loss Distributions, Part I) introduced the application of the barycentric approximation methodology for evaluating profit-and-loss distributions numerically. Although convergence of the quantiles is ensured by the weak convergence of the discrete measures, as shown in Part I, recent numerical results have indicated that the approximations of the profit-and-loss distribution become less practical when the portfolio reaches a reasonable complexity. This experience has revealed that the weak convergence of the probability measures is not strong enough for evaluating quantiles numerically in a satisfactory way.  
The authors have therefore focused on information offered by the barycentric approximation but still unused in the algorithmic procedure of Part I. It has been realized that the dual to the derived discrete probability measure helps evaluate the profit-and-loss distribution in a better way. In this Part II, the barycentric approximation technique is outlined and benchmarked with the intention to focus on the dual viewpoint for simplicial refinement. This technique imposes no assumption on the risk factor space, except that the variance-covariance matrix of the risk factors exists. Therefore, it is applicable to general multivariate or empirical distributions. Furthermore, the technique provides approximations of the risk profile as well as of the risk factor distribution.  
Beforehand, various test environments are specified which help illustrate the sensitivity of value-at-risk numbers. These environments are characterized by the probability measure P of the risk factors and a risk profile g which represents the payoff structure of some portfolio. The corresponding numerical results illustrate the sensitivity of value-at-risk with respect to market volatility and correlation of risk factors. This provides information on the model risk one is exposed to within the value-at-risk approach.  
Date:  August 20, 1997  
Project:  Approximations of ProfitandLoss Distributions  
Textfiles:  Full Version: (67 pages) Postscript (4.69 MBytes) Compressed Postscript (gzip, 1.23 MBytes) Portable Document Format (748 KBytes) 

Management Version: (15 pages) Postscript (848 KBytes) Compressed Postscript (gzip, 252 KBytes) Portable Document Format (239 KBytes) 


Some Generalizations of Bessel Processes  
Author:  Anja Göing, Department of Mathematics, ETH Zürich  
Abstract:  Bessel processes are a one-parameter family of diffusion processes that appear in many financial problems and have remarkable properties. As for the importance of Bessel processes in financial mathematics, let us first mention their contribution to the problem of pricing Asian options with arithmetic asset average. Furthermore, Bessel processes play an important role in the Cox-Ingersoll-Ross model for interest rates, on which we will concentrate here. The Cox-Ingersoll-Ross processes are time-space-transformed squared Bessel processes. By using these relations we obtain results about first hitting times of Cox-Ingersoll-Ross processes. Bessel processes with negative dimensions are introduced. These processes arise quite naturally when the exponential of Brownian motion with a negative drift is considered. As an important tool, we investigate time reversal in detail.  
Date:  April 1997  
Type:  RiskLab report  
Project:  Generalizations of Bessel Processes  
Size:  47 pages  
Textfiles:  Postscript (371 KBytes) Compressed Postscript (gzip, 145 KBytes) Portable Document Format (413 KBytes) 

Comment:  For revised and extended results, see also Parameter Estimation and Bessel Processes in Financial Models and A Survey and Some Generalizations of Bessel Processes 


Factors at Risk  
Authors:  Prof. Hans-Jakob Lüthi and Gerold Studer (both IFOR, ETH Zürich)  
Abstract:  The identification of scenarios which have a particularly low or high P&L helps to get a better understanding of the portfolio's risk exposure. Therefore, the notions of safe (resp. dangerous) regions are introduced, which represent sets where the P&L is greater (resp. less) than a given critical level. In order to describe such sets in an easily interpretable way, one-dimensional intervals are used. Such intervals can be determined by solving a sequence of restricted maximum loss problems.  
Date:  January 1997  
Type:  RiskLab technical report  
Project:  Risk Aggregation Techniques for Complex Financial Portfolios  
Size:  10 pages  
Textfiles:  Postscript (201 KBytes) Compressed Postscript (gzip, 67 KBytes) Portable Document Format (187 KBytes) 


Quadratic Maximum Loss for Risk Measurement of Portfolios  
Authors:  Prof. Hans-Jakob Lüthi and Gerold Studer (both IFOR, ETH Zürich)  
Abstract:  Effective risk management requires adequate risk measurement. A basic problem herein is the quantification of market risks: what is the overall effect on a portfolio if market rates change? The first chapter gives a brief review of the standard risk measure Value-at-Risk (VAR) and introduces the concept of Maximum Loss (ML) as a method for identifying the worst case in a given scenario space, called the Trust Region. Next, a technique for efficiently calculating ML for quadratic functions is described; the algorithm is based on the Levenberg-Marquardt theorem, which reduces the high-dimensional optimization problem to a one-dimensional root-finding problem.  
Following this, the idea of the Maximum Loss Path is presented: repetitive calculation of ML for a growing trust region leads to a sequence of worst cases, which form a complete path. Similarly, the paths of Maximum Profit (MP) and Expected Value (EV) can be determined; comparing them permits judgements on the quality of portfolios. These concepts are also applicable to non-quadratic portfolios by using Dynamic Approximations, which replace arbitrary profit and loss functions with a sequence of quadratic functions.  
Finally, the idea of the Maximum Loss Distribution is explained. The distributions of ML and MP can be obtained directly from the ML and MP paths. They lead to lower and upper bounds of VAR and allow statements about the spread of ML and MP.  
Date:  September 1996  
Type:  RiskLab technical report  
Project:  Risk Aggregation Techniques for Complex Financial Portfolios  
Size:  31 pages  
Textfiles:  Postscript (1.29 MBytes) Compressed Postscript (gzip, 222 KBytes) Portable Document Format (431 KBytes) 


Estimation in Financial Models  
Author:  Anja Göing, Department of Mathematics, ETH Zürich  
Abstract:  Over the last few years various new derivative instruments have emerged in financial markets, leading to a demand for versatile estimation methods for the relevant model parameters. Typical examples include volatility, covariances and correlations. In this paper we give a survey of statistical estimation methods for both discrete- and continuous-time stochastic models.  
Date:  January 1996  
Type:  RiskLab report  
Project:  Estimation in Financial Models  
Size:  85 pages  
Textfiles:  Postscript (615 KBytes) Compressed Postscript (gzip, 227 KBytes) Portable Document Format (650 KBytes) 
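As one concrete example of the kind of estimator surveyed: for geometric Brownian motion observed at spacing dt, the sample standard deviation of log-returns, scaled by 1/sqrt(dt), is a consistent estimator of the volatility. A minimal sketch on synthetic data (a plain moment estimator with made-up parameters, not a specific method from the report):

```python
import math
import random

def estimate_vol(prices, dt):
    """Sample volatility of log-returns, rescaled to the unit time horizon."""
    logret = [math.log(prices[i + 1] / prices[i]) for i in range(len(prices) - 1)]
    mean = sum(logret) / len(logret)
    var = sum((r - mean) ** 2 for r in logret) / (len(logret) - 1)
    return math.sqrt(var / dt)

# Simulate GBM with known sigma and check the estimator recovers it.
rng = random.Random(7)
mu, sigma, dt, n = 0.05, 0.20, 1.0 / 252.0, 100_000
prices = [100.0]
for _ in range(n):
    z = rng.gauss(0.0, 1.0)
    prices.append(prices[-1] * math.exp((mu - 0.5 * sigma**2) * dt
                                        + sigma * math.sqrt(dt) * z))

print(estimate_vol(prices, dt))   # close to the true sigma = 0.20
```

The estimator is exact in distribution for GBM because the log-returns are i.i.d. normal; for more general diffusions, discrete sampling introduces the biases the report's survey addresses.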


Approximation of P&L Distributions  
Authors:  Prof. Karl Frauendorfer, Pierre-Yves Moix and Olivier Schmid (Institute of Operations Research, University of St. Gallen) 

Abstract:  Value functions (risk profiles) of financial instruments and the real distributions of risk factors are not available in analytically closed forms. These components have to be approximated. In this work, a new approach for risk measurement is introduced. The underlying methodology is based on the utilization of extremal measures for approximating the P&L distribution. A special class of "extremal measures" is employed which exploits the monotonicity of price sensitivities entailed by convexity. Clearly, in case the value functions have monotone derivatives, the payoff functions are convex or concave depending on whether a position is held short or long. The incorporated extremal measures provide approximations for both the risk factor distribution and the risk profiles, and allow for deriving an adequate approximation of the P&L distribution, in particular appealing VaR estimates. The basics of this approach are presented and first numerical results are tested against the currently applied VaR approaches and the simulation benchmarks established earlier in Allen.  
Date:  December 1995  
Type:  RiskLab report  
Project:  Approximations of ProfitandLoss Distributions  
Size:  22 pages  
Textfiles:  Postscript (438 KBytes) Compressed Postscript (gzip, 125 KBytes) Portable Document Format (291 KBytes) 


Value At Risk and Maximum Loss Optimization  
Author:  Gerold Studer (IFOR, ETH Zürich) 
Abstract:  A very condensed overview of risk measurement methods is given and the different techniques are classified. The risk measure Value At Risk (VAR) is presented from a new point of view and a general definition of VAR is derived. Next, Maximum Loss (ML) is formulated as a mathematical optimization problem and its modelling is described. 
The techniques for calculating ML for linear and quadratic risk profiles are presented. Some theoretical relations between VAR and ML are demonstrated: ML is presented as a general framework including Delta-Normal VAR as well as Wilson's Delta-Gamma approach. It is also proven that ML is a worst case measure which is always more conservative than VAR.  
Date:  December 1995 
Type:  RiskLab technical report, revised version 
Project:  Risk Aggregation Techniques for Complex Financial Portfolios 
Size:  30 pages 
Textfiles:  Postscript (2 MBytes) Compressed Postscript (gzip, 249 KBytes) Portable Document Format (430 KBytes) 
Created and supported by Uwe Schmock until September 2003. Please send comments and suggestions to Jörg Osterrieder / Gallus Steiger, email: finance_update@math.ethz.ch. Last update: October 17, 2005 
© RiskLab, ETH Zürich.