Volatility Forecasting – A Performance Measure of GARCH Techniques with Different Distribution Models

Volatility forecasting is an interesting and challenging topic in current financial markets, as it is directly associated with profits. Many risks and rewards are directly associated with volatility, which makes forecasting it an indispensable task in finance. The distribution assumptions of GARCH models play an important role in risk measurement and option pricing. The aim of this paper is to measure the performance of GARCH techniques for forecasting volatility using different distribution models. We have used 9 distribution models to forecast the volatility of a stock index. The GARCH distribution models examined in this paper are Std, Norm, SNorm, GED, SSTD, SGED, NIG, GHYP and JSU. Volatility is forecast 10 days in advance and the forecast values are compared with the actual values to find the best distribution model for volatility forecasting. From the results obtained, it is observed that GARCH with the GED distribution outperforms all other models.


INTRODUCTION
Volatility plays a key role in finance: it underpins option pricing and risk management. Volatility is directly associated with risks and returns; the higher the volatility, the more unstable the financial market. Rapidly changing volatility may result in either high profits or huge losses. Volatility directly or indirectly influences asset return series, equity prices and foreign exchange rates. If the pattern of volatility clusters is studied over a long duration, we observe that once volatility reaches a peak it tends to persist for an extended period. Such patterns are readily captured by the Generalized Autoregressive Conditional Heteroscedasticity (GARCH) model introduced by Bollerslev [1986]. Volatility models identify and track the volatility clusters, whether reaching higher or lower peaks, by modeling the clusters explicitly. In each period, the conditional variance is driven by an innovation term whose scale depends on the returns and volatilities of past periods. Considering volatility dynamics with a single lagged period, the GARCH(1, 1) model has become a workhorse in both academia and practice because of its simplicity and intuitive interpretation.
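As a concrete illustration of the recursion just described, a minimal GARCH(1, 1) conditional-variance filter can be sketched in a few lines. The parameter values (omega, alpha, beta) below are illustrative assumptions, not estimates from this paper's data:

```python
# Minimal GARCH(1,1) conditional-variance recursion:
#   sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]
# Parameter values are illustrative only, not fitted to any data set.

def garch11_variance(returns, omega=0.00001, alpha=0.08, beta=0.90):
    """Return the conditional variance series for a list of returns."""
    # Initialize at the unconditional variance omega / (1 - alpha - beta).
    sigma2 = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        # Today's variance responds to yesterday's squared shock and
        # yesterday's variance -- the volatility-clustering mechanism.
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2
```

A large shock r feeds into the next period's variance through alpha, and the beta term makes high-variance periods persist, which is exactly the clustering behavior the text describes.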
When applying GARCH models in financial risk management, the distribution of the GARCH innovations plays a critical part. From the definition of the GARCH model, it is clear that the conditional distribution of future returns has the same shape as the distribution of the innovations. Consequently, an inappropriate model for the distribution of the innovations may lead to either underestimation or overestimation of future risk. Furthermore, different distributions of GARCH innovations may also lead to different option pricing results. This paper examines an existing analysis framework for the distribution of GARCH innovations and demonstrates its drawbacks when applied to financial time series. Further, we develop an alternative technique, oriented particularly towards applications to financial time series.
Recent work in the field of finance using GARCH techniques is discussed in this section. Francesco Audrino [2016] [1] discusses volatility forecasting on the S&P 500 data set considering downside risk, jumps and the leverage effect. The paper separates the leverage effect into continuous and discontinuous components, and past volatility into good and bad leverage. Momtchil Dojarliev [2014] [2] researched volatility and value-at-risk evaluation for the MSCI North American Index; the paper compares techniques such as the Naive, GARCH, AGARCH and BEKK models in forecasting volatility. Of all the techniques, the Naive model has the highest failure rate and the BEKK model the highest success rate. Karunanithy Banumathy [2015] [3], in their research work on modeling stock market volatility using GARCH with the Akaike Information Criterion (AIC) and Schwarz Information Criterion (SIC), shows that the GARCH and TGARCH estimations are the most appropriate models to capture symmetric and asymmetric volatility respectively.
Amadeus Wennström [2014] [4] researched volatility forecasting and the performance of 6 commonly used forecasting models: the simple moving average, the exponentially weighted moving average, the ARCH model, the GARCH model, the EGARCH model and the GJR-GARCH model. The data sets used in that report are three Nordic equity indices: OMXS30, OMXH25 and OMXC20. The results of that research suggest that EGARCH has better MSE (Mean Square Error) rates than the other techniques. Yiannis Dendramis [2012] [5] shows that the performance of parametric volatility models, such as EGARCH or GARCH, can be considerably improved if they are combined with skewed distributions of return innovations. The performance of these models is found to be similar to that of the EVT (extreme value theory) approach, and superior to that of their extensions based on Markov regime-switching effects with or without EGARCH effects. The paper recommends that the performance of the latter approach can be further substantially improved if it is based on filtered residuals obtained through volatility models that allow for skewed distributions of return innovations.
Another study uses the best out-of-sample volatility forecasting ability as the criterion for selecting the most adequate model from among the alternatives; in that study, six simulated experiments on GARCH(p, q) with six different error distributions are carried out [7]. The Generalized Autoregressive Conditional Heteroskedasticity (GARCH) model, designed to capture volatility clustering, exhibits heavy-tailedness regardless of the distribution of its innovation term. When applying the model to financial time series, the distribution of the innovations plays an important part in risk estimation and option pricing [8].
Financial return series are essentially characterized by a zero mean, high kurtosis and little, if any, autocorrelation. The squares of these returns often exhibit high correlation and persistence, which makes ARCH-type models suitable for estimating the conditional volatility of such processes; see Engle (1982) for the original work, Bollerslev et al. (1994) for a review of volatility models and Engle and Patton (2001) for some extensions. The ARCH parameters are typically estimated using maximum likelihood (ML) methods, which are optimal when the data is drawn from a Gaussian distribution [9].
Another study examines the forecasting performance of four GARCH(1, 1)-type models (GARCH, EGARCH, GJR and APARCH) used with three distributions (Normal, Student-t and skewed Student-t). The authors investigate and compare different possible sources of forecast improvement: asymmetry in the conditional variance, fat-tailed distributions and skewed distributions. Two major European stock indices (FTSE 100 and DAX 30) are considered, using daily data over a 15-year period. The results suggest that improvements in the overall estimation are achieved when asymmetric GARCH models are used and when fat-tailed densities are accounted for in the conditional variance. It is also found that GJR and APARCH give better forecasts than symmetric GARCH. Finally, improved performance of the forecasts is not clearly observed when using non-normal distributions [10] [11].
A further study compares 330 ARCH-type models in terms of their ability to describe the conditional variance [12]. The models are compared out-of-sample using DM-$ exchange rate data and IBM return data, where the latter is based on a new data set of realized variance. The authors find no evidence that a GARCH(1, 1) is outperformed by more sophisticated models in the analysis of exchange rates, while the GARCH(1, 1) is clearly inferior to models that can accommodate a leverage effect in the analysis of IBM returns. The models are compared using the test for superior predictive ability (SPA) and the reality check for data snooping (RC). The empirical results demonstrate that the RC lacks power to an extent that makes it unable to distinguish "good" and "bad" models in their analysis [12].

VOLATILITY
In finance, volatility is defined as the degree of variation of a trading price series over time, as measured by the standard deviation of returns. Historical volatility is derived from the time series of past market prices. An implied volatility is derived from the market price of a market-traded derivative (in particular, an option). The symbol σ is used for volatility and corresponds to the standard deviation, which should not be confused with the similarly named variance, which is instead the square, σ². The three principal purposes of estimating volatility are measuring risk, allocating assets and profiting from financial trading. A large part of volatility forecasting is measuring the potential future losses of a portfolio of assets, and in order to estimate these potential losses, estimates must be made of future volatilities and correlations. In asset management, the Markowitz approach of minimizing risk for a given level of expected return has become a standard methodology, and clearly an estimate of the variance-covariance matrix is required to measure volatility. Perhaps the most challenging application of volatility forecasting is the use of risk factors in building a risk-adjusted return model.

METHODOLOGY
The methodology can be split into five steps: data acquisition, data preprocessing, estimation of volatility, forecasting using GARCH techniques and result comparison. Data acquisition is the first step, in which stock market closing prices are collected. The steps are explained in detail below.
Data Acquisition - This paper uses 10 years of stock market data for volatility forecasting. We have selected the S&P 500 index as the input data set. S&P 500 end-of-day stock data is downloaded from Yahoo Finance. The 10 years of historical data range from 5th December 2005 to 4th December 2015, resulting in 2,514 samples. One row of data is generated per day, except on Saturdays and Sundays, as there are no transactions on weekends.

Data Preprocessing
The downloaded data consists of 7 columns: Date, Open, High, Low, Close, Volume and Adjusted Closing price. The data as downloaded is in reverse chronological (most-recent-first) order; it is rearranged so that the most recent date comes last, in order to predict the volatility values for the next 10 days. The data is checked for missing or NA values; such entries are either replaced with the mean or median, or deleted.
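The reordering and mean-imputation steps above can be sketched as follows. This is a minimal sketch: the rows are simplified to (date, close) pairs rather than the full seven-column Yahoo Finance layout, and mean imputation is chosen from the alternatives the text lists:

```python
# Preprocessing sketch: sort rows oldest-first and replace missing
# closing prices with the column mean. Rows are simplified to
# (ISO date string, close price or None) pairs.

def preprocess(rows):
    """rows arrive newest-first; return them oldest-first with NAs filled."""
    rows = sorted(rows, key=lambda r: r[0])          # ISO dates sort correctly
    closes = [c for _, c in rows if c is not None]
    mean_close = sum(closes) / len(closes)           # mean imputation
    return [(d, c if c is not None else mean_close) for d, c in rows]
```

Median imputation or row deletion, the other options mentioned in the text, would be a one-line change in the same structure.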

Estimating Volatility
The volatility is estimated from the open, high, low and close values of the stock data; generally, volatility is calculated as the standard deviation of the returns of the stock data. Here, the volatility is calculated over every 10-day interval of stock data. The close-to-close method is the most commonly used volatility calculation technique; it works on the closing prices of the stock data. The plot of estimated volatility for the S&P 500 using closing prices is shown in Fig. 1. Over a window of n log returns, the close-to-close volatility is

\sigma_{cc} = \sqrt{\frac{1}{n-1}\sum_{t=1}^{n}\left(r_t - \bar r\right)^2}, \qquad r_t = \ln\!\left(\frac{C_t}{C_{t-1}}\right) .......... (1)

Forecasting of Volatility - The volatility estimated with the close-to-close technique is forecast using the GARCH technique. In this paper we apply GARCH with different distribution models in order to forecast accurately. The volatility is forecast 10 days in advance.
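Equation (1) applied over rolling 10-day windows can be sketched directly. This is a minimal stdlib implementation of the close-to-close estimator described above; the window length of 10 follows the text:

```python
import math

# Close-to-close volatility: sample standard deviation of log returns
# over a rolling window (10 trading days, as in the text).

def log_returns(closes):
    """r_t = ln(C_t / C_{t-1}) for consecutive closing prices."""
    return [math.log(closes[i] / closes[i - 1]) for i in range(1, len(closes))]

def rolling_volatility(closes, window=10):
    """Sample std. dev. of log returns over each full rolling window."""
    r = log_returns(closes)
    vols = []
    for i in range(window, len(r) + 1):
        w = r[i - window:i]
        mean = sum(w) / window
        var = sum((x - mean) ** 2 for x in w) / (window - 1)  # eq. (1)
        vols.append(math.sqrt(var))
    return vols
```

The resulting series is what is fed to the GARCH models as the realized-volatility benchmark for the forecast comparison.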

Result Comparison
The results of the GARCH technique with the different distribution models are compared using the error measure MSE (mean square error). The distribution model with the lowest MSE value is considered the most accurate distribution model. The main aim of this research is to find the distribution model with the lowest error.
Figure 1. Volatility of S&P 500 Index for a period of 10 years
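The comparison criterion is the standard mean square error between the forecast and realized volatility series, which can be stated in one line:

```python
# Mean square error between realized and forecast volatility series,
# used to rank the nine distribution models.

def mse(actual, forecast):
    assert len(actual) == len(forecast), "series must be the same length"
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)
```

The model whose 10-day-ahead forecasts minimize this quantity against the realized close-to-close volatility is declared the best distribution model.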

GARCH TECHNIQUES
GARCH refers to the Generalized Autoregressive Conditional Heteroskedasticity model, designed to capture volatility clustering; it exhibits heavy-tailedness depending on the choice of innovation term. When applying a GARCH model to forecast volatility, the distribution of the innovation terms plays a critical part in risk estimation and option pricing. GARCH models were developed to explain volatility clustering. In the standard GARCH model, the innovation (or residual) distribution is assumed to be standard normal, despite the fact that this assumption is frequently rejected empirically. Consequently, GARCH models with non-normal innovation distributions have been developed. In this research, GARCH techniques are applied with different innovation distributions in order to determine which yields the better forecasting model. Financial models with long-tailed distributions and volatility clustering have been introduced to overcome shortcomings of classical models of financial time series, which fail to capture the heteroskedasticity, skewness, heavy tails and volatility clustering of empirical asset returns.
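Once a GARCH(1, 1) model is fitted, the 10-day-ahead forecasts used in this paper follow from the model's mean-reversion property: the h-step-ahead variance forecast decays geometrically from the next-period variance toward the unconditional variance. A minimal sketch, with illustrative (not fitted) parameters:

```python
# h-step-ahead GARCH(1,1) variance forecast:
#   E[sigma2_{t+h}] = sigma2_bar + (alpha + beta)**(h-1) * (sigma2_{t+1} - sigma2_bar)
# where sigma2_bar = omega / (1 - alpha - beta) is the unconditional variance.
# Parameter values below are illustrative assumptions only.

def garch11_forecast(sigma2_next, omega=0.00001, alpha=0.08, beta=0.90,
                     horizon=10):
    """Forecast the conditional variance for horizons 1..horizon."""
    sigma2_bar = omega / (1.0 - alpha - beta)
    phi = alpha + beta  # persistence; must be < 1 for mean reversion
    return [sigma2_bar + phi ** (h - 1) * (sigma2_next - sigma2_bar)
            for h in range(1, horizon + 1)]
```

The closer alpha + beta is to 1, the more slowly the forecast reverts, which is why highly persistent fitted models produce nearly flat 10-day forecast paths.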

Skew Normal distribution
In probability theory and statistics, the skew normal distribution is a continuous probability distribution that generalizes the normal distribution to allow for non-zero skewness.
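In Azzalini's standard form, the skew normal density can be written as

```latex
f(x;\alpha) = 2\,\varphi(x)\,\Phi(\alpha x),
```

where \varphi and \Phi are the standard normal density and distribution function and \alpha is the skewness parameter; \alpha = 0 recovers the ordinary normal distribution.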

Skew Student distribution
The Normal, Student and GED distributions have skew variants which have been standardized to zero mean and unit variance by making use of the corresponding moment conditions.

Normal Inverse Gaussian distribution
The normal-inverse Gaussian (NIG) distribution is a continuous probability distribution defined as the normal variance-mean mixture whose mixing density is the inverse Gaussian distribution. The NIG distribution was noted by Blaesild in 1977 as a subclass of the generalized hyperbolic distribution discovered by Ole Barndorff-Nielsen; the following year Barndorff-Nielsen published the NIG in another paper. It was introduced into the mathematical finance literature in 1997.
The NIG distribution is controlled by a tail-heaviness parameter \alpha, an asymmetry parameter \beta, a scale parameter \delta and a location parameter \mu, related through \gamma = \sqrt{\alpha^2 - \beta^2}. The probability density function is given by

f(x;\alpha,\beta,\mu,\delta) = \frac{\alpha\delta\,K_1\!\left(\alpha\sqrt{\delta^2+(x-\mu)^2}\right)}{\pi\sqrt{\delta^2+(x-\mu)^2}}\,e^{\delta\gamma+\beta(x-\mu)},

where K_1 denotes the modified Bessel function of the third kind of order 1.

Generalized Hyperbolic distribution
The generalized hyperbolic (GH) skew Student distribution was popularized by Aas and Haff (2006) because of its uniqueness in the GH family in having one tail with polynomial and one with exponential behavior. This distribution is a limiting case of the GH when \alpha \to |\beta| and \lambda = -\nu/2, where \nu is the shape parameter of the Student distribution. The domain of variation of the parameters is \beta \in R and \nu > 0, but for the variance to be finite \nu > 4, while for the existence of skewness and kurtosis, \nu > 6 and \nu > 8 respectively. The density of the random variable x involves the modified Bessel function of the third kind; its full expression is given in Aas and Haff (2006).

Johnson's reparametrized SU distribution
The reparametrized Johnson SU distribution, discussed in Rigby and Stasinopoulos (2005), is a four-parameter distribution denoted by JSU(\mu, \sigma, \nu, \tau), with mean \mu and standard deviation \sigma for all values of the skew and shape parameters \nu and \tau respectively.
In Johnson's original parameterization, with location \xi, scale \lambda and shape parameters \gamma and \delta, the probability density function is given by

f(x) = \frac{\delta}{\lambda\sqrt{2\pi}}\,\frac{1}{\sqrt{1+z^2}}\,\exp\!\left[-\tfrac{1}{2}\left(\gamma + \delta\sinh^{-1} z\right)^2\right], \qquad z = \frac{x-\xi}{\lambda}.

Results
This research mainly focused on exploring GARCH techniques with different distribution models. The GARCH technique was tested with 9 different distribution models on the same data set to forecast 10 days in advance, and the 10 future forecast values are tabulated. [17] compares various GARCH techniques; the observations according to that paper are as follows: the GED-GARCH model is better than t-GARCH, and t-GARCH is better than N-GARCH. Yiannis Dendramis [5], on GARCH model performance in stocks, suggests that the skewed Student-t and GED distributions constitute excellent tools for modeling the distributional features of asset returns. According to Abu Hassan [2009] [18], among GARCH models with normal, skew-normal and Student-t distributions, the Student-t outperforms the other distribution models.

Results Comparison
The results in this paper also suggest that the GARCH model with the GED distribution has the minimal mean square error and outperforms the other GARCH distribution models.

Summary
The aim of this research work is to forecast volatility with high accuracy using different distributions of the GARCH technique. This paper uses S&P 500 index end-of-day stock market data over a period of 10 years for volatility forecasting. The volatility was calculated as the standard deviation of returns over rolling intervals, and given as input to GARCH techniques with different distribution parameters. The research work uses 9 GARCH distribution models to forecast the volatility: Std, Norm, SNorm, GED, SSTD, SGED, NIG, GHYP and JSU. Future volatility values are forecast 10 days in advance and compared with the actual values to find the best distribution model. Based on the results obtained, GARCH with the GED distribution predicts volatility with the least error compared to the other models. Future work for this research could be the application of hybrid distribution models to forecast volatility, where a hybrid model may combine two or more techniques to improve the results.