simple example simulation of delta hedging with python

Here we present simple Python code for a delta hedging example on a call option.
It's a minimal example with zero interest rates and no dividends. On day 1 we sell 10 near-ATM call options and start delta hedging, i.e. buying/selling stock so that the change in the stock position neutralizes the change in the options' value. The portfolio is then  \Pi = delta*Stock - callOption + cashAccount

to simulate stock prices we will use log-normal dynamics  \frac{dS_t}{S_t} =\sigma_t dW_t

We will also simulate implied volatility as log-normal:  \frac{d\sigma_t}{\sigma_t}=\xi dB_t

Each day of the simulation is stored in a DataFrame df, so it is easy to print and plot with the pandas library.
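A minimal sketch of one simulated day under these two dynamics (Python 3; the values of dt, the vol-of-vol \xi and the seed are illustrative):

```python
import numpy as np

dt = 1 / 250.0        # one trading day in years
xi = 0.5              # vol-of-vol (illustrative value)
S, vol = 100.0, 0.30  # spot and implied volatility

rng = np.random.default_rng(0)
# Euler step of dS/S = vol*sqrt(dt)*dW
S *= 1.0 + vol * np.sqrt(dt) * rng.standard_normal()
# Euler step of dvol/vol = xi*sqrt(dt)*dB
vol *= 1.0 + xi * np.sqrt(dt) * rng.standard_normal()
print(S, vol)
```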

to run the Python code you will need the pandas library installed in your distribution

apart from the usual NPV of the hedging portfolio and the value and greeks of the call option, we will also display the P&L prediction  PNL \approx \frac{\Gamma}{2} (dS)^2 + \theta dt
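This prediction is just the second-order Taylor expansion of the option value; schematically

 dC \approx \Delta\, dS + \frac{\Gamma}{2} (dS)^2 + \theta\, dt

and since the stock position cancels the  \Delta\, dS term, the delta-hedged portfolio is left with the gamma and theta terms.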

delta hedging simulation

first 10 days of delta hedging:

       spot   vol  shares      cash  option   npv    vega  gamma  theta  pnlPredict   pnl  error
0    100.00 30.00    0.00      0.00    0.00  0.00    0.00   0.00   0.00        0.00   nan    nan
1    102.89 29.83    6.35   -401.20  252.38  0.00 -772.52  -0.06   0.12       -0.14  0.00   0.00
2    103.65 29.89    6.40   -406.13  257.56 -0.36 -774.36  -0.06   0.12        0.10 -0.36  -0.46
3     98.00 29.34    6.02   -368.79  218.14  2.91 -754.75  -0.07   0.11       -0.96  3.27   4.23
4     96.83 29.61    5.95   -362.24  213.06  0.93 -748.61  -0.07   0.11        0.06 -1.98  -2.04
5     95.76 29.35    5.86   -353.97  204.68  2.96 -743.77  -0.07   0.11        0.07  2.03   1.96
6     95.40 28.75    5.81   -348.68  198.03  7.53 -742.86  -0.07   0.11        0.10  4.57   4.46
7     94.12 28.71    5.71   -339.71  190.26  7.85 -735.94  -0.07   0.11        0.05  0.32   0.28
8     94.28 27.96    5.69   -336.97  185.56 13.47 -737.79  -0.07   0.10        0.10  5.62   5.52
9     95.42 27.56    5.75   -343.04  189.00 16.52 -744.17  -0.07   0.10        0.05  3.05   3.00
10    96.53 28.02    5.85   -353.01  198.72 13.16 -748.41  -0.07   0.11        0.06 -3.36  -3.42

P&L profile

running this code for 10, 100, 1000 and 10000 days, we can see the P&L profile become narrower and more centered at 0 as the number of days grows; in the limit of infinite days (equivalent to the infinite hedging frequency of the Black-Scholes assumptions) it would look like a Dirac delta function.

pnl profile delta hedging 10 100 1000 10000 days

Why is it skewed to the left? The gamma of our short option position is always negative and largest near ATM, and the daily P&L is approximately  PNL \approx \frac{\Gamma S^2}{2}((\frac{dS}{S})^2-\sigma^2 dt) . If there is a jump down, delta decreases sharply and we must sell many shares at a low price. If there is a jump up, we still need to buy more stock (as the delta of the call option goes up), so we would lose money in this case too. That makes the daily P&L profile asymmetric.
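The asymmetry is easy to check numerically: for a delta-hedged short call, both an up jump and a down jump lose money. A self-contained Python 3 sketch (same zero-rate Black-Scholes formula as the code below; the ±10 jump size is illustrative):

```python
from math import erf, log, sqrt

def ncdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(tau, S, K, sigma):
    # zero-rate Black-Scholes call: returns (price, delta)
    d1 = log(S / K) / (sigma * sqrt(tau)) + 0.5 * sigma * sqrt(tau)
    d2 = d1 - sigma * sqrt(tau)
    return S * ncdf(d1) - K * ncdf(d2), ncdf(d1)

S, K, sigma, tau = 100.0, 100.0, 0.30, 0.5
c0, delta = bs_call(tau, S, K, sigma)
for jump in (-10.0, +10.0):
    c1, _ = bs_call(tau, S + jump, K, sigma)
    # short call hedged with delta shares: rebalancing P&L after the jump
    pnl = -(c1 - c0) + delta * jump
    print(jump, round(pnl, 2))  # negative for both jump directions
```

By convexity of the call price, the hedged short position loses on any large move in either direction.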

Theta

setting the spot price constant in the code, we can see the theta effect on the hedging strategy:
here's an example of running the Python code with constant spot:

theta OTM delta hedging constant spot

volatility dynamics

here we will run the code first with constant volatility, and then with stochastic implied volatility:
we can see that the pnl has a much wider distribution in the latter case. This is because the Black-Scholes assumptions no longer hold (we would need to hedge with other options to get a better pnl profile)

constant vs stochastic volatility pnl delta hedging

import numpy as np
import pandas as pd
pd.set_option('display.width', 320)
pd.set_option('display.max_rows', 100)
pd.options.display.float_format = '{:,.2f}'.format
from scipy.stats import norm
import matplotlib.pyplot as plt

def BlackScholes(tau, S, K, sigma):
    #call price and greeks, assuming zero rates and no dividends
    d1=np.log(S/K)/sigma/np.sqrt(tau)+0.5*sigma*np.sqrt(tau)
    d2=d1-sigma*np.sqrt(tau)
    npv=(S*norm.cdf(d1)-K*norm.cdf(d2))
    delta=norm.cdf(d1)
    gamma=norm.pdf(d1)/(S*sigma*np.sqrt(tau))
    vega=S*norm.pdf(d1)*np.sqrt(tau)
    theta=-.5*S*norm.pdf(d1)*sigma/np.sqrt(tau)
    return {'npv':npv,'delta':delta,'gamma':gamma,'vega':vega,'theta':theta}

class Call(object):
    def __init__(self,start,T,K,N):
        self.T=T
        self.K=K
        self.start=start  #day to sell   option
        self.N=N

    def calc(self,today,vol,S):
        if today<self.start:
            return {'delta':0,'npv':0,'vega':0,'gamma':0,'theta':0,'intrinsic':0}
        if today>self.T:
            return {'delta':0,'npv':0,'vega':0,'gamma':0,'theta':0,'intrinsic':0}
        if today==self.T:
            return {'delta':0,'npv':0,'vega':0,'gamma':0,'theta':0,'intrinsic':self.N*max(0,S-self.K)}
        tau=(self.T-today)/250.
        call=BlackScholes(tau, S, self.K, vol)
        return {'delta':self.N*call['delta'],'npv':self.N*call['npv'],'vega':self.N*call['vega'],'gamma':self.N*call['gamma'],'theta':self.N*call['theta'],'intrinsic':self.N*max(0,S-self.K)}

def deltahedge(Ndays,Sdynamics="S*=(1.0+vol*np.sqrt(dt)*np.random.randn())",volDynamics="vol=.30"):
    S=90.0
    Strike=100.0
    vol=.30
    columns=('spot','vol','shares','cash','option','npv','vega','gamma','theta','pnlPredict')
    df = pd.DataFrame([[S,vol,0,0,0,0,0,0,0,0]],columns=columns)
    dt=1/250.
    cash=0
    dayToSellCall=1
    maturityCall=Ndays-1
    call=Call(dayToSellCall,maturityCall,Strike,-10)# sell 10 calls on dayToSellCall day
    for day in xrange(1,Ndays+1):
        exec Sdynamics
        exec volDynamics

        if day==dayToSellCall: #sell call
          callValue=call.calc(day,vol,S)
          cash-=callValue['npv']

        #delta hedge
        callValue=call.calc(day,vol,S)
        delta=callValue['delta']
        currentNumberShares=df.iloc[day-1].shares
        sharesBuy=-currentNumberShares-delta
        cash-=sharesBuy*S
        if day==maturityCall:
            cash+=call.calc(day,vol,S)['intrinsic'] #settle call

        gamma=callValue['gamma']
        theta=callValue['theta']
        dS=S-df.iloc[day-1].spot
        pnlPredict=0.5*gamma*dS*dS+theta*dt
        dfnew=pd.DataFrame([[S,vol,-delta,cash,-callValue['npv'],cash+callValue['npv']-delta*S,callValue['vega'],gamma,theta/250.,pnlPredict]],columns=columns)
        df=df.append(dfnew,ignore_index=True)

    df['pnl'] = df['npv'] - df['npv'].shift(1)
    df['vol']=100.0*df['vol']
    df['error']=df['pnl']-df['pnlPredict']
    df.set_value(dayToSellCall, "error", 0)
    #df.loc[:,['vol','spot']].plot(title='Spot and implied Volatility')
    df.loc[:,['npv','spot','option']].plot(title='-Call+delta*S+cash vs Spot {0} {1}'.format(Sdynamics,volDynamics))
    df.loc[:,['theta']].plot(title='Theta {0} {1}'.format(Sdynamics,volDynamics))
    df.loc[:,['pnl']].hist(bins=50)
    #df.loc[:,['error']].hist(bins=50)
    print df.loc[:,['pnl']].describe()
    #print df

deltahedge(1000)#constant vol
deltahedge(1000,volDynamics="vol*=(1.0+0.5*np.sqrt(dt)*np.random.randn())")#stochastic vol
deltahedge(1000,Sdynamics="S=90") #constant OTM
deltahedge(10000,Sdynamics="S=100") #constant ATM
deltahedge(10000,Sdynamics="S=110")#constant ITM
deltahedge(10000,Sdynamics="S+=S*dt") #growing stock
plt.show()
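As a sanity check of the BlackScholes function above: with zero rates an ATM call reduces to  S(2N(0.5\sigma\sqrt{\tau})-1) . A self-contained Python 3 check (using erf for the normal CDF instead of scipy):

```python
from math import erf, log, sqrt

def ncdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(tau, S, K, sigma):
    # same zero-rate formula as BlackScholes() above
    d1 = log(S / K) / (sigma * sqrt(tau)) + 0.5 * sigma * sqrt(tau)
    d2 = d1 - sigma * sqrt(tau)
    return S * ncdf(d1) - K * ncdf(d2)

price = bs_call(1.0, 100.0, 100.0, 0.2)
atm_shortcut = 100.0 * (2.0 * ncdf(0.5 * 0.2) - 1.0)
print(round(price, 3), round(atm_shortcut, 3))  # both ~7.966
```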
Posted in OTC derivatives valuation, quant finance coding course

python and derivatives pricing tutorial – brownian motion

In pricing models (Black-Scholes, local vol, Libor market model) the source of uncertainty is Brownian motion. How do we simulate it in Python? python download

first, let's simulate dW_t : we know it's a Gaussian variable with mean 0 and standard deviation  \sqrt {dt}
google "python gaussian" and we get to the numpy Gaussian,
so dW_t is just np.random.randn()*math.sqrt(dt)

knowing dW_t , how do we get to W_t at time t?  W_T=\int_0^T {dW_t} , and an integral is just a sum, so  W_T = \sum {dW_t} . The final function to calculate W_T is:

 
import numpy as np
dt=0.01
def W(T):
  sum=0
  N=T/dt
  for i in xrange(int(N)):
    sum+=np.random.randn()*np.sqrt(dt)
  return sum
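The same W can be vectorized with numpy (a Python 3 sketch: the increments are summed with cumsum, so we get the whole path and W_T is the last element):

```python
import numpy as np

dt = 0.01

def W_path(T, rng=np.random.default_rng()):
    # Brownian path on a grid of step dt: cumulative sum of N(0, dt) increments
    dW = rng.standard_normal(int(T / dt)) * np.sqrt(dt)
    return np.cumsum(dW)

path = W_path(1.0)   # 100 points; path[-1] has the same law as W(1)
```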

now let's check the properties of Brownian motion by simulation:

1) mean should be zero

 
T=1.0
nSim=10000
mean=0
for i in xrange(nSim):
  mean+=W(T)
mean/=nSim  
print mean

2) variance should be T

 
T=3.0
nSim=10000
mean=0
for i in xrange(nSim):
  wt=W(T)
  mean+=wt*wt
mean/=nSim  
print mean

now let's calculate the probability  P( W(1)<0 \text{ and } W(2) < 0)

nSim=100000
proba=0
for i in xrange(nSim):
  w1=W(1)
  w2=w1+W(1)
  if w1<0 and w2<0:
    proba+=1.
proba/=nSim  
print proba
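For this particular probability there is a closed form to check the simulation against: W(1) and W(2) are jointly Gaussian with correlation  \rho = 1/\sqrt{2} , and the standard Gaussian orthant formula gives  P = \frac{1}{4} + \frac{\arcsin\rho}{2\pi} = \frac{3}{8} :

```python
from math import asin, pi, sqrt

# corr(W(1), W(2)) = Cov(W(1), W(2)) / (1 * sqrt(2)) = 1/sqrt(2)
rho = 1.0 / sqrt(2.0)
# orthant probability for a centered bivariate Gaussian
proba = 0.25 + asin(rho) / (2.0 * pi)
print(proba)  # 3/8 = 0.375
```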
Posted in quant finance coding course

python and derivatives pricing tutorial

Tutorial objective: write and understand simple minimal programs in python for pricing financial derivatives

topics:

Brownian motion
objective: draw and calculate properties of brownian motion using python

Black scholes pricing
objective: calculate call option price

Heston model
objective: draw forward smile as function of parameters

Libor Market model

objective: implement minimal libor market model

Markov functional model
objective: implement minimal libor markov functional model

Local Volatility model
objective: implement simplest local vol model

PDE Black Scholes
objective: implement simple PDE pricer

all topics are divided into 3 parts:
Maths,python,exercises for extension of topic

Posted in OTC derivatives valuation, quant finance coding course

How to prepare for a quantitative finance interview


If you apply for a quant analyst/quant developer job at an investment bank or hedge fund, your quantitative finance interview will generally consist of 4 parts:

  • Programming (C++, Python, data structures)
  • General probability/calculus questions
  • Stochastic calculus
  • Derivatives pricing questions for your asset class (equity derivatives, interest rate derivatives, credit derivatives)

Programming

usually 2 types of questions are asked:
– data structures questions (e.g. how to construct a queue using only 2 stacks, how to sort an array)
– specific programming questions, usually in C++ or Python.

the best way to prepare for the programming part of the interview is to write small programs and debug them to see the order of execution and the variables' contents. This can be very useful (copy-paste programs from tutorials into your IDE and debug them)

C++ questions usually focus on your understanding of the virtual function mechanism, shared pointers and inheritance, so the best way to prepare is to write several minimal programs using a base and a derived class, and then debug the code to see which constructors, destructors or member functions get executed.
Also get familiar with the const, mutable and auto keywords. A typical question is to implement a shared pointer from scratch – look here for an interview-style implementation

Python questions usually include pythonic ways to do simple summations on dictionaries, the yield keyword and closures. Again, the best way to prepare is to write minimal programs for different ways to sort an array or traverse a tree, and debug them.

data structures questions require an understanding of common data structures (stack, queue, tree) and the basic algorithms that work with them. One of the best resources on this topic is the Coursera Princeton course on algorithms and data structures

questions on data structures usually involve deriving the most speed-optimal method, so get familiar with O(N) notation and the complexity of common algorithms (quicksort, bubble sort, merging of arrays)
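For example, the queue-from-two-stacks question mentioned above has a classic answer that can be sketched as follows (a standard construction, not taken from the original post):

```python
class QueueTwoStacks:
    """FIFO queue built from two LIFO stacks (Python lists)."""
    def __init__(self):
        self.inbox = []    # receives pushes
        self.outbox = []   # serves pops in reversed (FIFO) order

    def enqueue(self, x):
        self.inbox.append(x)

    def dequeue(self):
        if not self.outbox:
            # move everything over, reversing the order once
            while self.inbox:
                self.outbox.append(self.inbox.pop())
        return self.outbox.pop()

q = QueueTwoStacks()
for i in (1, 2, 3):
    q.enqueue(i)
print(q.dequeue(), q.dequeue())  # 1 2
```

Each element is moved at most twice, so dequeue is O(1) amortized.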

Probability/calculus questions

get to know Bayes' theorem well and solve a few dozen exercises involving it. Calculus questions are usually about calculating integrals and solving simple differential equations.
a lot of questions involve calculating conditional expectation, so prepare by solving a few dozen exercises on the discrete and continuous versions of conditional expectation.

Stochastic calculus

these questions require you to demonstrate good knowledge of Ito's lemma and martingale theory: remember by heart the one- and two-dimensional Ito formula, and then derive differentials for common processes like  W^2 ,  W^3 ,  \ln(S_t) ,  \exp(S_t) where  S_t is a Black-Scholes diffusion.
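for example, applying the one-dimensional Ito formula to  f(W_t)=W_t^2 (a standard textbook computation, included here as a worked step):

 d(W_t^2)=2W_t\,dW_t + dt

so  W_t^2 - t is a martingale; the other processes above are handled the same way.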

for questions involving the same Brownian motion at different times, remember that these points are jointly Gaussian and use conditional expectation and variance results

Derivatives pricing questions

even if you interview for interest rate quant positions, chances are you will still need to know equity derivatives well, as most questions are more easily formulated using stock prices.
You need to be confident deriving the Black-Scholes equation using both the PDE and the expectation / change of measure methods. Typical questions include calculating the prices of calls and binary options in the Black-Scholes and binomial tree models. Get familiar with local volatility and the stochastic volatility Heston model; some questions involve qualitative price differences of, say, barrier options under the two models (the main thing to remember here is that the forward skew is flatter under local volatility than under stochastic volatility). Also get familiar with the lookback option hedging argument by Gatheral (google "gatheral lookback").

For interest rate derivatives, Girsanov's theorem will invariably come up in the interviews, so learn how to derive the Libor-in-arrears and CMS convexity adjustments using a change of measure. Bermudan swaption pricing in one-factor or Libor market models can come up, so learn how to qualitatively estimate its price based on mean reversion/correlation.

For credit positions it's useful to know how to derive the NPV of the floating and fixed legs of a CDS, and the characteristics of the Poisson process. An implementation of Credit Valuation Adjustment is discussed here :

Summary

Review your quant finance courses to understand the basic concepts very well. Debug line by line common implementations of algorithms, and practice on your phone using our app: Quant Finance Test. If you can solve all of these questions in under 5 minutes you are good to go.

Posted in OTC derivatives valuation

PCA simple example (principal component analysis)

simple examples of what PCA does with a 2D set of points:
an ellipsoid-like set of points:

pca example

the upper graph is the original set of points,
the red vectors are the PCA components scaled by explained variance,
the bottom graph is the transformed set of points

so one can see that PCA will:

1) move the center of the axes to the mean of the points
2) find the axis with the most variance; the second axis will be perpendicular to the first one
3) project the points onto the new axes
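These three steps can be sketched with numpy alone (an SVD-based sketch mirroring what sklearn's PCA does; the elongated test cloud is illustrative):

```python
import numpy as np

def pca_transform(points):
    # 1) move the origin to the mean of the points
    centered = points - points.mean(axis=0)
    # 2) principal axes = right singular vectors of the centered data,
    #    ordered by decreasing variance
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    # 3) project the points onto the new axes
    return centered @ Vt.T, Vt

# elongated 2D cloud: most variance along the x axis
rng = np.random.default_rng(0)
pts = rng.standard_normal((500, 2)) * np.array([5.0, 1.0])
transformed, components = pca_transform(pts)
print(components[0])  # first axis is close to (+-1, 0)
```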

results with a small outlier (outliers make PCA less useful):

pca example outlier

simple Python code to play with PCA:

[copy-paste it into an ipython environment with scikit-learn installed]

press the left mouse button to add points to the 2D set
press the right mouse button to generate the PCA of the points

from Tkinter import *
from pylab import *   #scatter, plot, axis, sqrt, show used below
import numpy as np
from sklearn.decomposition import PCA

points=np.zeros((1,2))

def leftbutton(event):
    global points
    event.widget.create_oval(event.x,event.y,event.x+5 ,event.y+5,fill="red" )
    points[0][0]=event.x
    points[0][1]=event.y
    points=np.append(points,[[event.x,event.y]],axis=0)

def rightbutton(event):
    pca = PCA()
    pca.fit(points)
    traint=pca.transform(points)
    scatter(points[:,0],points[:,1],alpha=0.5)
    pmean=np.mean(points,0)
    axis('equal')
    plot([pmean[0]],[ pmean[1]], 'g.', markersize=10.0)
    S=sqrt(pca.explained_variance_)
    plot([pmean[0],pmean[0]+S[0]*pca.components_[0][0]],[pmean[1],pmean[1]+S[0]*pca.components_[0][1]],'r-',lw=2)
    plot([pmean[0],pmean[0]+S[1]*pca.components_[1][0]],[pmean[1],pmean[1]+S[1]*pca.components_[1][1]],'r-',lw=2)
    show()
    scatter(traint[:,0],traint[:,1],alpha=0.5)
    show()
    print "pca vector=",pca.components_
    print "0,0->",pca.inverse_transform([0,0])

root = Tk()
cnv = Canvas(root, bg="white", width=500, height= 500)
cnv.pack()
cnv.bind("<Button-1>", leftbutton)
cnv.bind("<Button-3>", rightbutton)
root.mainloop()
Posted in math

Libor in arrears convexity adjustment simple example

Libor in arrears

simplified Libor-in-arrears payoff: pay at time 1 the 1-year Libor reset at time 1, F(1)

 \frac{NPV(0)}{P_1(0)}=\mathbb{E}^1 (\frac{F(1)}{P_1(1)})

where \mathbb{E}^1 denotes expectation under the measure with numeraire P_1(t).
We change measure from numeraire P_1(t) to numeraire P_2(t) (the bond maturing at time 2, the end of the Libor period over which F(t) is changing)
with the Girsanov formula:

 \frac{P_1(0)}{P_1(1)}dP_1=\frac{P_2(0)}{P_2(1)}dP_2

so we get

 NPV(0) = P_2(0) E^2 ( \frac{F(1)}{P_2(1)})

but  \frac{1}{P_2(1)}=1+F(1)

so  NPV(0)=P_2(0)\mathbb{E}^2(F(1))+P_2(0)\mathbb{E}^2(F^2(1))

under \mathbb{E}^2 , F(1) is a martingale, i.e. \mathbb{E}^2 F(1)= F(0)

to calculate \mathbb{E}^2(F^2(1)) we must introduce dynamics for F(t) ,

for example Black-Scholes, where under \mathbb{E}^2 : dF(t)=\sigma F(t) dW_t

so  F(1)=F(0) e^{\sigma W_1 - 0.5 \sigma^2 } and therefore

 \mathbb{E}^2 F^2(1)=F^2(0) e^{-\sigma^2 } \mathbb{E}^2 e^{2 \sigma W_1 } = F^2(0) e^{\sigma^2 }
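A quick Monte Carlo check of this last expectation (Python 3; the values of F(0) and \sigma are illustrative):

```python
import numpy as np

F0, sigma = 0.03, 0.20
rng = np.random.default_rng(1)
W1 = rng.standard_normal(200_000)                # W_1 ~ N(0, 1)
F1 = F0 * np.exp(sigma * W1 - 0.5 * sigma**2)    # martingale: E[F1] = F0
mc = np.mean(F1**2)
analytic = F0**2 * np.exp(sigma**2)
print(mc, analytic)  # the two values agree to Monte Carlo accuracy
```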

for the CMS convexity adjustment, use the linear swap rate model in a similar change-of-measure argument.

Credit Valuation Adjustment (CVA) simple example

 CVA = (1-R) \int{DF(t) EE(t) dQ_t}

where R is the recovery rate (normally set to 40%), EE(t) is the expected exposure at time t, dQ_t is the survival probability density, and DF(t) is the discount factor at time t

Outline

1) In this simple example we will use the Hull-White model to generate future yield curves. In practice many banks use yield curve evolution models based on historical simulations.
In the Hull-White model the short rate  r_t is normally distributed with known mean and variance.

2) For each point in time we will generate the whole yield curve based on the short rate. Then we will price our interest rate swap on each of these curves.

3) to approximate CVA we will use the BASEL III formula for the regulatory capital charge, approximating the default probability [or survival probability] as exp(-ST/(1-R)),
so we get
 CVA= (1-R) \sum  \frac{EE^{*}_{T_{i}}+EE^{*}_{T_{i-1}}}{2} (e^{-\frac{ST_{i-1}}{1-R}}-e^{-\frac{ST_{i}}{1-R}})^+
where  EE^{*} is the discounted expected exposure of the portfolio
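A minimal numeric sketch of this discretized sum (the flat exposure profile, flat spread S and semi-annual grid are illustrative assumptions, not output of the simulation below):

```python
from math import exp

R = 0.4                             # recovery
S = 0.05                            # flat CDS spread (illustrative)
T = [0.5 * i for i in range(11)]    # semi-annual grid out to 5 years
EEstar = [1000.0] * len(T)          # toy discounted expected exposure profile

cva = 0.0
for i in range(1, len(T)):
    # survival probability approximated as exp(-S*t/(1-R)), as in the formula above
    dq = exp(-S * T[i - 1] / (1.0 - R)) - exp(-S * T[i] / (1.0 - R))
    cva += 0.5 * (EEstar[i] + EEstar[i - 1]) * max(dq, 0.0)
cva *= 1.0 - R
print(round(cva, 2))
```

With a flat exposure the sum telescopes to  (1-R)\,EE^{*}\,(1-e^{-S T /(1-R)}) , a useful sanity check.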

Details

For this first example we'll take a 5% flat forward yield curve. In the second test example below we'll take the full curve as of 26/12/2013.

1) Hull-White model for future yield curve simulations

the model is given by dynamics:
 dr_t=(\theta_t-ar_t)dt+\sigma dW_t
We will use the fact that in the Hull-White model the short rate  r_t is normally distributed, with mean and variance given by
 E(r_t|r_s)=r_se^{-a(t-s)}+\gamma_t-\gamma_se^{-a(t-s)}
 Var(r_t|r_s)=\frac{\sigma^2}{2a}(1-e^{-2a(t-s)})
where  \gamma_t=f_t(0)+\frac{\sigma^2}{2a}(1-e^{-at})^2
and  f_t(0) is instantaneous forward rate at time t as seen at time 0.
The calculations will not depend on \theta_t.
To generate future values of  r_t , first we will generate a matrix of standard normal variables using the numpy function numpy.random.standard_normal(). After that we will apply the transform  mean+\sqrt{variance}\, Normal(0,1) to get  r_t .
After getting the matrix of all  r_t for all simulations and time points, we will construct a yield curve for each  r_t using Hull-White discount factors for each future date:

 P_{\rightarrow T}(0)= E( e^{-\int_0^Tr_sds})=E( E(e^{-\int_0^Tr_sds}|F_{t_i}))=E(P_{\rightarrow t_i}(0)E(e^{-\int_{t_i}^Tr_sds}|F_{t_i}))

for each future time point  t_i , where T is the maturity.

We will also plot some of the generated yield curves.

from  QuantLib import *
import numpy as np
from math import *
from pylab import *

def A(t,T):
    evaldate=Settings.instance().evaluationDate
    forward = crvToday.forwardRate(t, t,Continuous, NoFrequency).rate()
    value = B(t,T)*forward - 0.25*sigma*B(t,T)*sigma*B(t,T)*B(0.0,2.0*t);
    return exp(value)*crvToday.discount(T)/crvToday.discount(t);

def B(t,T):
    return (1.0-exp(-a*(T-t)))/a;

def gamma(t):
        forwardRate =crvToday.forwardRate(t, t, Continuous, NoFrequency).rate()
        temp = sigma*(1.0 - exp(-a*t))/a
        return (forwardRate + 0.5*temp*temp)

def gamma_v(t): #vectorized version of gamma(t)
    res=np.zeros(len(t))
    for i in xrange(len(t)):
        res[i]=gamma(t[i])
    return res

Nsim=1000
#a=0.03
#sigma=0.00743
#parameters calibrated with Quantlib to coterminal swaptions on 26/dec/2013
a=0.376739
sigma=0.0209835

todaysDate=Date(26,12,2013);
Settings.instance().evaluationDate=todaysDate;
crvTodaydates=[Date(26,12,2013),
               Date(30,6,2014),
               Date(30,7,2014),
               Date(29,8,2014),
               Date(30,9,2014),
               Date(30,10,2014),
               Date(28,11,2014),
               Date(30,12,2014),
               Date(30,1,2015),
               Date(27,2,2015),
               Date(30,3,2015),
               Date(30,4,2015),
               Date(29,5,2015),
               Date(30,6,2015),
               Date(30,12,2015),
               Date(30,12,2016),
               Date(29,12,2017),
               Date(31,12,2018),
               Date(30,12,2019),
               Date(30,12,2020),
               Date(30,12,2021),
               Date(30,12,2022),
               Date(29,12,2023),
               Date(30,12,2024),
               Date(30,12,2025),
               Date(29,12,2028),
               Date(30,12,2033),
               Date(30,12,2038),
               Date(30,12,2043),
               Date(30,12,2048),
               Date(30,12,2053),
               Date(30,12,2058),
               Date(31,12,2063)]
crvTodaydf=[1.0,
            0.998022,
            0.99771,
            0.99739,
            0.997017,
            0.996671,
            0.996337,
            0.995921,
            0.995522,
            0.995157,
            0.994706,
            0.994248,
            0.993805,
            0.993285,
            0.989614,
            0.978541,
            0.961973,
            0.940868,
            0.916831,
            0.890805,
            0.863413,
            0.834987,
            0.807111,
            0.778332,
            0.750525,
            0.674707,
            0.575192,
            0.501258,
            0.44131,
            0.384733,
            0.340425,
            0.294694,
            0.260792
            ]

crvToday=DiscountCurve(crvTodaydates,crvTodaydf,Actual360(),TARGET())
#crvToday=FlatForward(todaysDate,0.0121,Actual360())

r0=crvToday.forwardRate(0,0, Continuous, NoFrequency).rate()
months=range(3,12*5+1,3)
sPeriods=[str(month)+"m" for month in months]
print sPeriods
Dates=[todaysDate]+[todaysDate+Period(s) for s in sPeriods]
T=[0]+[Actual360().yearFraction(todaysDate,Dates[i]) for i in xrange(1,len(Dates))]
T=np.array(T)
rmean=r0*np.exp(-a*T)+ gamma_v(T) -gamma(0)*np.exp(-a*T)
rvar=sigma*sigma/2.0/a*(1.0-np.exp(-2.0*a*T))
rstd=np.sqrt(rvar)
np.random.seed(1)
stdnorm = np.random.standard_normal(size=(Nsim,len(T)-1))

rmat=np.zeros(shape=(Nsim,len(T)))
rmat[:,0]=r0
for iSim in xrange(Nsim):
    for iT in xrange(1,len(T)):
        mean=rmat[iSim,iT-1]*exp(-a*(T[iT]-T[iT-1]))+gamma(T[iT])-gamma(T[iT-1])*exp(-a*(T[iT]-T[iT-1]))
        var=0.5*sigma*sigma/a*(1-exp(-2*a*(T[iT]-T[iT-1])))
        rnew=mean+stdnorm[iSim,iT-1]*sqrt(var)
        #if (rnew<0): rnew=0.001
        rmat[iSim,iT]=rnew

bonds=np.zeros(shape=rmat.shape)

#test: the average of the simulated discount factors E( E(exp(-int r ds)|F_ti) ) should reproduce today's curve
for iSim in xrange(Nsim):
    for iT in xrange(1,len(T)):
        bonds[iSim,iT]=bonds[iSim,iT-1]+rmat[iSim,iT]*(T[iT]-T[iT-1])

bonds=-bonds;
bonds=np.exp(bonds)

bondsmean=np.mean(bonds,axis=0)
#plot(T,bondsmean)
#plot(T,[crvToday.discount(T[iT]) for iT in xrange(len(T))])
#show()

startDate=Date(26,12,2013);

crvMat= [ [ 0 for i in xrange(len(T)) ] for iSim in range(Nsim) ]
npvMat= [ [ 0 for i in xrange(len(T)) ] for iSim in range(Nsim) ]

for row in crvMat:
    row[0]=crvToday

for iT in xrange(1,len(T)):
    for iSim in xrange(Nsim):
        crvDate=Dates[iT];
        crvDates=[crvDate]+[crvDate+Period(k,Years) for k in xrange(1,21)]
        rt=rmat[iSim,iT]
        #if (rt<0): rt=0
        crvDiscounts=[1.0]+[A(T[iT],T[iT]+k)*exp(-B(T[iT],T[iT]+k)*rt) for k in xrange(1,21)]
        crvMat[iSim][iT]=DiscountCurve(crvDates,crvDiscounts,Actual360(),TARGET())

bondT=np.zeros(shape=rmat.shape)
for iSim in xrange(Nsim):
    for iT in xrange(len(T)):
        bondT[iSim,iT]=bonds[iSim,iT]*crvMat[iSim][iT].discount(19-T[iT])

bondTmean=np.mean(bondT,axis=0)
np.set_printoptions(precision=4,suppress=True)
print 'bondTmean-Terminal bond\n',bondTmean-crvToday.discount(19)


plot(range(10),[crvMat[0][0].forwardRate(k, k,Continuous, NoFrequency).rate() for k in range(10)])
for i in xrange(min(Nsim,10)):
    plot(range(10),[crvMat[i][1].forwardRate(k, k,Continuous, NoFrequency).rate() for k in range(10)])
title('generated yield curves')
show()

#indexes definitions
forecastTermStructure = RelinkableYieldTermStructureHandle()
index = Euribor(Period("6m"),forecastTermStructure)

#swap 1 definition
maturity = Date(26,12,2018);
fixedSchedule = Schedule(startDate, maturity,Period("6m"), TARGET(),ModifiedFollowing,ModifiedFollowing,DateGeneration.Forward, False)
floatingSchedule = Schedule(startDate, maturity,Period("6m"),TARGET() ,ModifiedFollowing,ModifiedFollowing,DateGeneration.Forward, False)
swap1 = VanillaSwap(VanillaSwap.Receiver, 10000000,fixedSchedule,0.02, Actual360(),floatingSchedule, index, 0,Actual360())  #0.01215

for iT in xrange(len(T)):
    Settings.instance().evaluationDate=Dates[iT]
    allDates= list(floatingSchedule)
    fixingdates=[index.fixingDate(floatingSchedule[iDate]) for iDate in xrange(len(allDates)) if index.fixingDate(floatingSchedule[iDate])<=Dates[iT]]
    if fixingdates:
        for date in fixingdates[:-1]:
            try:index.addFixing(date,0.0)
            except:pass
        try:index.addFixing(fixingdates[-1],rmean[iT])
        except:pass
    discountTermStructure = RelinkableYieldTermStructureHandle()
    swapEngine = DiscountingSwapEngine(discountTermStructure)
    swap1.setPricingEngine(swapEngine)

    for iSim in xrange(Nsim):
        crv=crvMat[iSim][iT]
        discountTermStructure.linkTo(crv)
        forecastTermStructure.linkTo(crv)
        npvMat[iSim][iT]=swap1.NPV()

npvMat=np.array(npvMat)
npv=npvMat[0,0]
#replace negative values with 0
npvMat[npvMat<0]=0
EE=np.mean(npvMat,axis=0)
print '\nEE:\n',EE
#print '\nrmat:\n',rmat
print '\nrmean:\n',rmean
#print '\nrstd:\n',rstd
#print '\n95% are in \n',zip(rmean-2*rstd,rmean+2*rstd)


S=0.05
R=0.4
sum=0
for i in xrange(len(T)-1):
    sum=sum+0.5*crvToday.discount(T[i+1])*(EE[i]+EE[i+1])*(exp(-S*T[i]/(1.0-R))-exp(-S*T[i+1]/(1.0-R)))
CVA=(1.0-R)*sum
print "\nnpv=",npv
print "\nCVA=",CVA


plot(T,EE)
title('Expected Exposure')
xlabel('Time in years')
#plot(T,np.mean(rmat,axis=0))
#plot(T,rmean)
#plot(T,[npvMat[0,0]]*len(T))
show()
print "\nnpv",npvMat

output:

credit value adjustment Expected exposure python output

bloomberg CVA = 35266

NB: Bloomberg's exposure is our exposure multiplied by (1-R)

CVA swap

Posted in OTC derivatives valuation, quantlib