Keynes Fund

Summary of Project Results

The aim of the project was to study the statistical properties of new methods for modelling volatility and heavy-tailed distributions. Implementation of the models required writing new computer software.

High volatility can damage economic growth because it creates uncertainty. It can also indicate deep-seated problems in the way markets operate. Understanding and modelling volatility is therefore important for managing and regulating financial markets. As such, the project is relevant to the Keynes Fund in that it has implications for the interface between ‘best private sector practice and public policy’.

Extreme observations, which can be seen as arising from heavy-tailed distributions, are also a feature of uncertainty. Much of our work has been concerned with the interactions between heavy tails and volatility in financial markets. The performance of standard time series models can often be adversely affected by outliers, whereas our models are, by construction, much more robust. Although much of the work was motivated by financial issues and applied to financial time series, the methodology developed has applications in all areas where nonlinear time series models are used. For example, one of our recent working papers (Harvey and Ito) applies the methods to rainfall time series.

Research Output - Published Papers

Tracking a Changing Copula

Tracking a Changing Copula, Andrew Harvey, Journal of Empirical Finance, Vol. 17(3) pp. 485-500 (2010)

Abstract: 

A copula models the relationships between variables independently of their marginal distributions. When the variables are time series, the copula may change over time. Recursive procedures based on indicator variables are proposed for tracking these changes over time. Estimation of the unknown parameters is by maximum likelihood. When the marginal distributions change, pre-filtering is necessary before constructing the indicator variables on which the recursions are based. This entails estimating time-varying quantiles and a simple method based on time-varying histograms is proposed. The techniques are applied to the Hong Kong and Korean stock market indices. Some interesting and unexpected movements are detected, particularly after the attack on the Hong Kong dollar in 1997.
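The indicator-based recursion at the heart of this approach can be illustrated with a toy example. The sketch below is not the paper's estimator: it simply tracks an exponentially weighted estimate of the probability that both (pre-filtered) series fall in their lower quartiles, and the function name, smoothing constant `lam` and threshold `q` are all chosen for illustration.

```python
import numpy as np

def track_joint_tail(u, v, q=0.25, lam=0.05):
    """Exponentially weighted estimate of P(U <= q, V <= q) for two
    series already transformed to uniform [0, 1] margins.
    Illustrative recursion, not the paper's estimator."""
    p = np.empty(len(u))
    prev = q * q  # initialise at the value implied by independence
    for t in range(len(u)):
        ind = float(u[t] <= q and v[t] <= q)   # joint lower-tail indicator
        prev = (1.0 - lam) * prev + lam * ind  # EWMA-style recursive update
        p[t] = prev
    return p
```

If the two series move together in the lower tail, the tracked probability drifts up from the independence value q² towards one; under independence it hovers near q².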

Time Series Models with an EGB2 Conditional Distribution

Time Series Models with an EGB2 Conditional Distribution, Michele Caivano and Andrew Harvey, Journal of Time Series Analysis, Vol. 35(6) pp. 558-571 (2014)

Abstract: 

A time-series model in which the signal is buried in noise that is non-Gaussian may throw up observations that, when judged by the Gaussian yardstick, are outliers. We describe an observation-driven model, based on an exponential generalized beta distribution of the second kind (EGB2), in which the signal is a linear function of past values of the score of the conditional distribution. This specification produces a model that is not only easy to implement but which also facilitates the development of a comprehensive and relatively straightforward theory for the asymptotic distribution of the maximum-likelihood (ML) estimator. Score-driven models of this kind can also be based on conditional t distributions, but whereas these models carry out what, in the robustness literature, is called a soft form of trimming, the EGB2 distribution leads to a soft form of Winsorizing. An exponential general autoregressive conditional heteroscedastic (EGARCH) model based on the EGB2 distribution is also developed. This model complements the score-driven EGARCH model with a conditional t distribution. Finally, dynamic location and scale models are combined and applied to data on the UK rate of inflation.
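The "soft Winsorizing" behaviour can be illustrated with the logistic distribution, a special case of EGB2. The sketch below is illustrative rather than the paper's model, and all parameter values are hypothetical: because the location score tanh(v/2) is bounded, an extreme residual shifts the filtered level by at most a fixed amount instead of dominating it.

```python
import math

def logistic_score_filter(y, omega=0.0, phi=0.95, kappa=0.5, scale=1.0):
    """Score-driven location filter with a logistic conditional
    distribution (a special case of EGB2). The score tanh(v/2) is
    bounded, so outliers are softly Winsorized. Parameters illustrative."""
    mu = y[0]            # initialise at the first observation
    filtered = []
    for yt in y:
        filtered.append(mu)             # one-step-ahead filtered location
        v = (yt - mu) / scale           # standardised prediction error
        mu = omega + phi * mu + kappa * math.tanh(v / 2.0)  # bounded score
    return filtered
```

An arbitrarily large observation can move the level by at most kappa per period, which is the Winsorizing behaviour described above.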

Filtering with Heavy Tails

Filtering with Heavy Tails, Andrew Harvey and Alessandra Luati, Journal of the American Statistical Association, Vol. 109 pp. 1112-1122 (2014)

Abstract: 

An unobserved components model in which the signal is buried in noise that is non-Gaussian may throw up observations that, when judged by the Gaussian yardstick, are outliers. We describe an observation driven model, based on a conditional Student t-distribution, that is tractable and retains some of the desirable features of the linear Gaussian model. Letting the dynamics be driven by the score of the conditional distribution leads to a specification that is not only easy to implement, but which also facilitates the development of a comprehensive and relatively straightforward theory for the asymptotic distribution of the maximum likelihood estimator. The methods are illustrated with an application to rail travel in the UK. The final part of the article shows how the model may be extended to include explanatory variables.
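The score-driven mechanism can be sketched in a few lines. The following is a minimal, illustrative filter (not the paper's full unobserved components model, and with hypothetical parameter values) in which a first-order recursion for the location is driven by the score of a conditional Student t distribution; the "soft trimming" property shows up in the score tending to zero as the residual grows.

```python
import numpy as np

def t_score_filter(y, omega=0.0, phi=0.95, kappa=0.5, sigma=1.0, nu=3.0):
    """Score-driven location filter with a Student t conditional
    distribution (minimal sketch; parameter values illustrative).
    Update: mu_{t+1} = omega + phi*mu_t + kappa*u_t, where u_t is the
    conditional score. Large residuals are downweighted (soft trimming):
    u_t -> 0 as |y_t - mu_t| -> infinity."""
    mu = np.empty(len(y) + 1)
    mu[0] = y[0]  # initialise at the first observation
    for t, yt in enumerate(y):
        v = yt - mu[t]                                 # prediction error
        u = (nu + 1) * v / (nu * sigma**2 + v**2)      # t score: soft trimming
        mu[t + 1] = omega + phi * mu[t] + kappa * u
    return mu[:-1]  # one-step-ahead filtered location
```

A Gaussian filter (score u = v) would jump towards an outlier; here the score vanishes for extreme residuals, so a single wild observation barely moves the filtered level.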

EGARCH Models with Fat Tails, Skewness and Leverage

EGARCH Models with Fat Tails, Skewness and Leverage, Andrew Harvey and Genaro Sucarrat, Computational Statistics & Data Analysis, Vol. 76 pp. 320-338 (2014)

Abstract: 

An EGARCH model in which the conditional distribution is heavy-tailed and skewed is proposed. The properties of the model, including unconditional moments, autocorrelations and the asymptotic distribution of the maximum likelihood estimator, are set out. Evidence for skewness in a conditional t-distribution is found for a range of returns series, and the model is shown to give a better fit than comparable skewed-t GARCH models in nearly all cases. A two-component model gives further gains in goodness of fit and is able to mimic the long memory pattern displayed in the autocorrelations of the absolute values.
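The score-driven EGARCH recursion can be illustrated as follows. This is a minimal sketch of a Beta-t-EGARCH-type update (symmetric, one component, no leverage, with illustrative parameter values): the logarithm of scale is driven by the score of a conditional t distribution, which is bounded, so a single extreme return cannot blow up the volatility estimate.

```python
import numpy as np

def beta_t_egarch(y, omega=0.0, phi=0.98, kappa=0.05, nu=5.0):
    """Minimal sketch of a Beta-t-EGARCH-type recursion (illustrative
    parameters, not a full implementation). The log of scale, lam_t, is
    driven by the score of a conditional t distribution; the score is
    bounded in [-1, nu], so extreme returns have a capped impact."""
    lam = np.empty(len(y) + 1)
    lam[0] = omega  # initialise log-scale at its unconditional level
    for t, yt in enumerate(y):
        eps2 = (yt / np.exp(lam[t])) ** 2       # squared standardised return
        u = (nu + 1) * eps2 / (nu + eps2) - 1   # bounded score: u in [-1, nu]
        lam[t + 1] = omega * (1 - phi) + phi * lam[t] + kappa * u
    return np.exp(lam[:-1])  # one-step-ahead conditional scale
```

In a standard GARCH recursion the squared return enters directly, so one extreme observation can dominate the volatility path; here the log-scale increment is capped at kappa times nu.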

Robust Time Series Models with Trend and Seasonal Components

Robust Time Series Models with Trend and Seasonal Components, Michele Caivano, Andrew Harvey and Alessandra Luati, SERIEs: Journal of the Spanish Economic Association, Vol. 7 pp. 99-120 (2016)

Abstract: 

We describe observation driven time series models for Student-t and EGB2 conditional distributions in which the signal is a linear function of past values of the score of the conditional distribution. These specifications produce models that are easy to implement and deal with outliers by what amounts to a soft form of trimming in the case of t and a soft form of Winsorizing in the case of EGB2. We show how a model with trend and seasonal components can be used as the basis for a seasonal adjustment procedure. The methods are illustrated with US and Spanish data.

Volatility Modeling with a Generalized t Distribution

Volatility Modeling with a Generalized t Distribution, Andrew Harvey and Rutger-Jan Lange, Journal of Time Series Analysis, Vol. 38(2) pp. 175-190 (2017)

Abstract: 

Exponential generalized autoregressive conditional heteroscedasticity models in which the dynamics of the logarithm of scale are driven by the conditional score are known to exhibit attractive theoretical properties for the t distribution and general error distribution. A model based on the generalized t includes both as special cases. We derive the information matrix for the generalized t and show that, when parameterized with the inverse of the tail index, it remains positive definite in the limit as the distribution goes to a general error distribution. We generalize further by allowing the distribution of the observations to be skewed and asymmetric. Our method for introducing asymmetry ensures that the information matrix reverts to the usual case under symmetry. We are able to derive analytic expressions for the conditional moments of our exponential generalized autoregressive conditional heteroscedasticity model as well as the information matrix of the dynamic parameters. The practical value of the model is illustrated with commodity and stock return data. Overall, the approach offers a unified, flexible, robust, and effective treatment of volatility.

Modeling the Interactions between Volatility and Returns using EGARCH-M

Modeling the Interactions between Volatility and Returns using EGARCH-M, Andrew Harvey and Rutger-Jan Lange, Journal of Time Series Analysis, Vol. 39(6) pp. 909-919 (2018)

Abstract: 

An EGARCH-M model, in which the logarithm of scale is driven by the score of the conditional distribution, is shown to be theoretically tractable as well as practically useful. A two-component extension makes it possible to distinguish between the short- and long-run effects of returns on volatility, and the resulting short- and long-run volatility components are then allowed to have different effects on returns, with the long-run component yielding the equity risk premium. The EGARCH formulation allows for more flexibility in the asymmetry of the volatility response (leverage) than standard GARCH models and suggests that, for weekly observations on two major stock market indices, the short-term response is close to being anti-symmetric.

Maximum Likelihood Estimates for Positive Valued Dynamic Score Models; The DySco Package

Maximum Likelihood Estimates for Positive Valued Dynamic Score Models; The DySco Package, Philipp Andres, Computational Statistics & Data Analysis, Vol. 76 pp. 34-42 (2014)

Abstract: 

Recently, the Dynamic Conditional Score (DCS) or Generalized Autoregressive Score (GAS) time series models have attracted considerable attention. This motivates the need for a software package to estimate and evaluate these new models. A straightforward to operate program called the Dynamic Score (DySco) package is introduced for estimating models for positive variables, in which the location/scale evolves over time. Its capabilities are demonstrated using a financial application.

Testing Against Changing Correlation

Testing Against Changing Correlation, Andrew Harvey and Stephen Thiele, Journal of Empirical Finance, Vol. 38 pp. 575-589 (2016)

Abstract: 

A test for time-varying correlation is developed within the framework of a dynamic conditional score (DCS) model for both Gaussian and Student t-distributions. The test may be interpreted as a Lagrange multiplier test and modified to allow for the estimation of models for time-varying volatility in the individual series. Unlike standard moment-based tests, the score-based test statistic includes information on the level of correlation under the null hypothesis and local power arguments indicate the benefits of doing so. A simulation study shows that the performance of the score-based test is strong relative to existing tests across a range of data generating processes. An application to the Hong Kong and South Korean equity markets shows that the new test reveals changes in correlation that are not detected by the standard moment-based test.

Research Output - Working Papers

Circuit Breakers on the London Stock Exchange: Do they improve subsequent market quality?

Circuit Breakers on the London Stock Exchange: Do they improve subsequent market quality?, James Brugler and Oliver Linton, 2014

Abstract: 

This paper uses proprietary data to evaluate the efficacy of single-stock circuit breakers on the London Stock Exchange during July and August 2011. We exploit exogenous variation in the length of the uncrossing periods that follow a trading suspension to estimate the effect of auction length on market quality, measured by volume of trades, frequency of trading and the change in realised variance of returns. We also estimate the effect of a trading suspension in one FTSE-100 stock on the volume of trades, trading frequency and the change in realised variance of returns for other FTSE-100 stocks in the same industrial sector as the event stock and in other sectors. We find that auction length has a significant detrimental effect on market quality for the suspended security when returns are negative but no discernible effect when returns are positive. We also find that trading suspensions help to ameliorate the spread of market microstructure noise and price inefficiency across securities during falling markets but the reverse is true during rising markets. Although trading suspensions may not improve the trading process within a particular security, they do play an important role in preventing the spread of poor market quality across securities in falling markets and therefore can be effective tools for promoting market-wide stability.

Modeling Time Series with Zero Observations

Modeling Time Series with Zero Observations, Andrew Harvey and Ryoko Ito (2017), Nuffield College Economics Working Paper 2017-W01, Oxford University

Abstract: 

We consider situations in which a significant proportion of observations in a time series are zero, but the remaining observations are positive and measured on a continuous scale. We propose a new dynamic model in which the conditional distribution of the observations is constructed by shifting a distribution for non-zero observations to the left and censoring negative values. The key to generalizing the censoring approach to the dynamic case is to have (the logarithm of) the location/scale parameter driven by a filter that depends on the score of the conditional distribution. An exponential link function means that seasonal effects can be incorporated into the model and this is done by means of a cubic spline (which can potentially be time-varying). The model is fitted to daily rainfall in northern Australia and compared with a dynamic zero-augmented model.
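The shift-and-censor construction can be illustrated with a static toy version. The function below is an illustrative sketch, not the paper's dynamic model: it takes a log-normal variable X, sets Y = max(0, X - shift), and returns the log-likelihood contribution of one observation. A zero contributes the log-normal CDF at the shift point (the probability mass that was censored); a positive value contributes the shifted density.

```python
import math

def censored_loglik(y, mu, sigma, shift):
    """Log-likelihood of one observation under a toy shift-and-censor
    scheme (illustrative, not the paper's model): X is log-normal with
    parameters (mu, sigma) and Y = max(0, X - shift)."""
    if y == 0.0:
        # P(Y = 0) = P(X <= shift) = Phi((ln(shift) - mu) / sigma)
        z = (math.log(shift) - mu) / sigma
        return math.log(0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))
    x = y + shift                        # undo the shift
    z = (math.log(x) - mu) / sigma
    # log of the log-normal density evaluated at x
    return -math.log(x * sigma * math.sqrt(2.0 * math.pi)) - 0.5 * z * z
```

In the dynamic version described above, mu would itself be updated by a score-driven filter, but the censoring logic for a single observation is the same.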

Project Information

Project Code: JHLH
Project Investigators
  • Emeritus Professor Andrew Harvey
Research Round
Second Round (March 2013)

Project Investigators

Professor Andrew Harvey is Emeritus Professor of Econometrics at the Faculty of Economics, University of Cambridge, a Fellow of the Econometric Society and a Fellow of the British Academy (FBA). His research expertise is in time series, financial econometrics, state space models, signal extraction and volatility.