I have always wanted to apply the risk optimization techniques previously discussed on this blog to a market that has not been extensively researched from a risk perspective; the Romanian stock market, for instance. Now that I have some time (and hopefully I will have time to follow the development of the model portfolio), I am providing the results. Previous posts on this blog have already outlined the steps and the underlying idea behind risk optimization.
So, directly to the implementation: I use a short period of time, Jan 21, 2015 – March 21, 2016 (I usually work with a longer period, but for this post I decided to narrow the sample). Raw stock prices are adjusted for stock dividends as well as for cash dividends (i.e. prices before the ex-dividend dates are adjusted), so I avoid breaks in the price series data.
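For illustration, here is a minimal sketch of how such a backward adjustment could look for a single cash dividend; the series, the dividend amount and the names prices, dividend and ex_div_index are hypothetical, and this is not necessarily the exact procedure applied to the raw data:
import pandas as pd
prices = pd.Series([10.0, 10.2, 10.1, 9.6, 9.7]) # hypothetical close prices
dividend = 0.5 # hypothetical cash dividend per share
ex_div_index = 3 # position of the ex-dividend day in the series
# scale all prices before the ex-dividend day so the dividend drop
# does not show up as an artificial negative return
factor = 1 - dividend / prices[ex_div_index - 1]
adjusted = prices.copy()
adjusted[:ex_div_index] = prices[:ex_div_index] * factor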
Results from optimization:
One-day 99% VaR:
Portfolio VaR (equal weighted): RON20356.59
Optimal VaR (optimized weights): RON18146.25
The optimization thus reduces the one-day 99% VaR by roughly 10.9% relative to the equal-weighted portfolio.
First, I standardize the daily returns (by subtracting the mean from each daily return and dividing the result by the standard deviation) to obtain the number of standard deviations from the mean, i.e. to show how “unusual” the daily returns are during the analysed period.
In terms of Python code:
import pandas as pd
import numpy as np
from numpy import *
import matplotlib.pyplot as plt
import scipy.optimize as scopt
from scipy.stats import norm
%matplotlib inline
Value=1e6 # portfolio value of RON1,000,000
CI=0.99 # set the confidence interval
data_all=pd.read_csv('RO.csv', delimiter=',') # the database also contains the trading day
data=data_all.iloc[:,1:14] # exclude the day column in order to calculate returns
tickers=['SNG', 'SNP', 'TGN', 'TEL', 'BRD', 'TLV', 'FP', 'EL', 'SIF1', 'SIF2', 'SIF3', 'SIF4', 'SIF5']
numbers=len(tickers)
ret=data/data.shift(1)-1 # calculate the simple returns
ret=ret[1:] # remove the first row since it contains NA
covariances=ret.cov() # covariance matrix of returns
variances=np.diag(covariances) # extract the variances of the individual shares from the covariance matrix
volatility=np.sqrt(variances) # standard deviations of the individual shares
annualized_ret=ret.mean()*252
annualized_vol=ret.std()*sqrt(252) # annualized volatility
weights = np.ones(data.columns.size)/data.columns.size # starting point is the equal-weighted portfolio
weights # show the starting (equal) weights
# 1. Calculate the standardized returns
standardized=(ret-mean(ret))/std(ret) # standardized returns
day=pd.DataFrame(data_all['Day'][1:]) # take the trading day from the database
data2=day.join(standardized)
# We can plot the standardized returns; for instance, here is the chart for Romgaz (SNG):
data2.plot('Day','SNG')
plt.xlabel('Day')
plt.title('Romgaz')
plt.xticks(rotation=20)
plt.ylabel('number of standard deviations from average')
# 2. Now heading for the real exercise
Pf_variance=np.dot(weights.T,np.dot(ret.cov(),weights)) # portfolio variance
Pf_volatility=np.sqrt(Pf_variance) # portfolio standard deviation
Moneyvariance=np.square(Value)*Pf_variance # portfolio variance in RON terms
Moneyvolatility=np.sqrt(Moneyvariance) # portfolio volatility in RON terms
covariance_asset_portfolio=np.dot(weights.T,covariances) # covariance of each share with the portfolio
cov_money=np.multiply(covariance_asset_portfolio,Value) # the same covariances in RON terms
beta=np.divide(covariance_asset_portfolio,Pf_variance) # beta of each share with respect to the portfolio
def VaR():
    # calculates the portfolio Value-at-Risk (Pf_VaR) in RON terms and the
    # individual Value-at-Risk (IndividualVaR) of the shares in the portfolio
    Pf_VaR=norm.ppf(CI)*Moneyvolatility
    IndividualVaR=np.multiply(volatility,Value*weights)*norm.ppf(CI)
    IndividualVaR=['RON%.2f' % elem for elem in IndividualVaR]
    print ('Portfolio VaR: ', 'RON%0.2f' % Pf_VaR)
    print ('Individual VaR: ', [[tickers[i], IndividualVaR[i]] for i in range(min(len(tickers), len(IndividualVaR)))])
VaR()
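As a quick check that is not part of the original code, the individual VaRs can also be summed into an undiversified VaR and compared with the portfolio VaR; the difference is the diversification benefit. A minimal sketch, reusing the variables defined above (the names undiversified_VaR and portfolio_VaR are introduced only for this check, and the individual VaRs are kept numeric rather than formatted as strings):
undiversified_VaR = np.sum(np.multiply(volatility, Value*weights)*norm.ppf(CI)) # sum of individual VaRs
portfolio_VaR = norm.ppf(CI)*Moneyvolatility # same figure as printed by VaR()
print('Undiversified VaR: ', 'RON%0.2f' % undiversified_VaR)
print('Diversification benefit: ', 'RON%0.2f' % (undiversified_VaR - portfolio_VaR))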
def marginal_component_VaR():
    # calculates the marginal Value-at-Risk in RON terms and the component
    # Value-at-Risk of the shares in the portfolio
    marginalVaR=np.divide(cov_money,Moneyvolatility)*norm.ppf(CI)
    componentVaR=np.multiply(weights,beta)*Moneyvolatility*norm.ppf(CI)
    marginalVaR=['%.3f' % elem for elem in marginalVaR]
    componentVaR=['RON%.2f' % elem for elem in componentVaR]
    print ('Marginal VaR: ', [[tickers[i], marginalVaR[i]] for i in range(min(len(tickers), len(marginalVaR)))])
    print ('Component VaR: ', [[tickers[i], componentVaR[i]] for i in range(min(len(tickers), len(componentVaR)))])
marginal_component_VaR()
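By construction the component VaRs add up to the portfolio VaR (the weighted betas sum to one), which gives a convenient consistency check. A small sketch of that check, again reusing the variables above; componentVaR_numeric is a name introduced here and is not part of the original code:
componentVaR_numeric = np.multiply(weights, beta)*Moneyvolatility*norm.ppf(CI) # component VaRs, unformatted
print('Sum of component VaRs: ', 'RON%0.2f' % np.sum(componentVaR_numeric))
print('Portfolio VaR: ', 'RON%0.2f' % (norm.ppf(CI)*Moneyvolatility))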
def VaR(weights):
    # redefines VaR as a function of the weights so that it can be passed to the optimizer
    return norm.ppf(CI)*np.sqrt(np.square(Value)*(np.dot(weights.T,np.dot(ret.cov(),weights))))

def minVaR(VaR):
    nstocks=data.columns.size
    bounds=[(0.,1.) for i in np.arange(nstocks)] # non-negative weights
    constraints=({'type': 'eq', 'fun': lambda weights: np.sum(weights)-1}) # sum of weights must equal 100%
    results=scopt.minimize(VaR, weights, method='SLSQP', constraints=constraints, bounds=bounds)
    result_weights=np.round(results.x, 4)
    result_weights=['%.8f' % elem for elem in result_weights]
    new_VaR=VaR(results['x'])
    print ('Optimal VaR: ', new_VaR)
    print ('Optimal Weights: ', [[tickers[i], result_weights[i]] for i in range(len(tickers))])
minVaR(VaR)
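For completeness, the same VaR(weights) function can be evaluated at the equal weights, which should reproduce the equal-weighted figure quoted at the top of the post; a minimal sketch that is not in the original code (equal_weights is a name introduced only here):
equal_weights = np.ones(len(tickers))/len(tickers) # the equal-weighted starting portfolio
print('Equal-weighted VaR: ', 'RON%0.2f' % VaR(equal_weights))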
beta # display the betas of the individual shares relative to the portfolio