Alireza Javaheri 2004 Quantitative Finance 4 C16
Estimation and statistical inference are central to many disciplines. Whatever your role, all you have is a time series of observations, yet from it you need to draw conclusions, estimate parameters and make forecasts.
Many aspects of time-series analysis originally came from electrical engineering—an electrical engineer has to process a noisy signal and extract information from it to build a working system.
Before discussing various models, we need to understand what noise is. The concept of white noise corresponds to a series of uncorrelated random variables with zero mean and a given finite variance.
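A minimal numerical illustration of this definition (assuming NumPy; the seed, sample size and variance are arbitrary choices): Gaussian white noise should show a sample mean near zero, a sample variance near the chosen value, and negligible autocorrelation.

```python
import numpy as np

# Illustrative sketch: Gaussian white noise -- uncorrelated draws with
# zero mean and a fixed finite variance (sigma^2 = 1 here).
rng = np.random.default_rng(0)
sigma = 1.0
eps = rng.normal(0.0, sigma, size=100_000)

# Sample mean and variance should be close to 0 and sigma^2.
print(eps.mean(), eps.var())

# Lag-1 sample autocorrelation should be near zero (uncorrelatedness).
rho1 = np.corrcoef(eps[:-1], eps[1:])[0, 1]
print(rho1)
```

Any white-noise sequence with finite variance would do for the linear theory; Gaussianity is the extra assumption that makes the linear models below fully tractable.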
Linear models were developed first. The autoregressive (AR) and moving-average (MA) models are perhaps the best known. Their combination, ARMA, covers many different practical cases. Why are linear models so popular? Mainly because they are analytically tractable. For a Gaussian (normal) noise, the linear framework is particularly convenient, since any linear combination of Gaussians will remain Gaussian. Knowing that a normal distribution can be entirely characterized via its first two moments, this makes the estimation process easier. Another useful property of the above time series is their being (second-order) stationary, meaning that the mean and variance are constant in time and the auto-covariance depends on the time lag only.
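The stationarity property can be checked on the simplest member of this family, an AR(1) process (a hedged sketch; the coefficient and sample size are arbitrary, not from the text): simulate x_t = phi·x_{t-1} + eps_t with |phi| < 1 and compare the sample variance to the stationary value sigma²/(1 − phi²).

```python
import numpy as np

# Illustrative AR(1) simulation: x_t = phi * x_{t-1} + eps_t, |phi| < 1.
rng = np.random.default_rng(1)
phi, sigma, n = 0.5, 1.0, 100_000

eps = rng.normal(0.0, sigma, size=n)
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

# Second-order stationarity: constant variance sigma^2 / (1 - phi^2),
# and an autocovariance that depends only on the lag.
var_theory = sigma**2 / (1 - phi**2)
print(x.var(), var_theory)                 # close to each other

# Lag-1 autocorrelation of a stationary AR(1) equals phi.
rho1 = np.corrcoef(x[:-1], x[1:])[0, 1]
print(rho1)                                # close to phi
```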
Unfortunately, most systems are nonlinear, whether we are dealing with signal processing, navigation, biology or finance. So even starting with a Gaussian white noise, the elegant properties of tractability are soon lost and we are dealing with much more complicated systems.
The best-known example in finance is heteroskedasticity. This means that when we observe the time series, the (conditional) variance is nonconstant and in fact random. Hence the concept of autoregressive conditional heteroskedasticity, introduced by Robert Engle and referred to as ARCH. A generalization of this model (GARCH) has more tractable properties. Any GARCH(1,1) model can be written as an ARCH model of infinite order. This is why the GARCH model has such good fitting properties: we need only three parameters to describe a time series under a GARCH(1,1) model, whereas we need many more to fit the same time series via an AR or ARCH model. This parsimony is one of the attractive properties of GARCH processes.
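A small simulation makes the GARCH behaviour concrete (the three parameter values are illustrative, not taken from the book): in a GARCH(1,1) process the returns themselves are nearly uncorrelated, yet their squares are positively autocorrelated — the signature of volatility clustering.

```python
import numpy as np

# Illustrative GARCH(1,1): three parameters, as noted in the text.
#   sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2
#   r_t = sigma_t * z_t,  z_t standard normal
rng = np.random.default_rng(2)
omega, alpha, beta = 0.05, 0.10, 0.85   # alpha + beta < 1 => stationary
n = 100_000

z = rng.standard_normal(n)
r = np.empty(n)
sig2 = np.empty(n)
sig2[0] = omega / (1 - alpha - beta)    # unconditional variance
r[0] = np.sqrt(sig2[0]) * z[0]
for t in range(1, n):
    sig2[t] = omega + alpha * r[t - 1]**2 + beta * sig2[t - 1]
    r[t] = np.sqrt(sig2[t]) * z[t]

# Volatility clustering: returns nearly uncorrelated, squared returns
# positively autocorrelated.
rho_r = np.corrcoef(r[:-1], r[1:])[0, 1]
rho_r2 = np.corrcoef(r[:-1]**2, r[1:]**2)[0, 1]
print(rho_r, rho_r2)
```

For fitting real data one would normally use a dedicated library rather than hand-rolled maximum likelihood, but the recursion above is the whole model.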
Another well-known nonlinear example is that of threshold AR processes, where the AR equation depends on a lagged variable being above or below a given threshold. This is similar to a jump diffusion process where the mean can vary depending on whether a jump occurs or not.
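A two-regime threshold AR(1) can be sketched in a few lines (the coefficients and threshold are arbitrary choices for the demo): the AR coefficient switches according to whether the lagged value is above or below the threshold, and a regime-by-regime least-squares fit recovers both coefficients, where a single linear AR(1) fit would only average them.

```python
import numpy as np

# Illustrative threshold AR(1) (TAR): the AR coefficient depends on
# whether the lagged value is below or above a threshold.
rng = np.random.default_rng(3)
n, threshold = 50_000, 0.0
phi_low, phi_high = 0.8, -0.3          # regime-dependent coefficients

eps = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    phi = phi_low if x[t - 1] <= threshold else phi_high
    x[t] = phi * x[t - 1] + eps[t]

# Regime-wise least squares (no intercept, matching the model above)
# recovers each coefficient separately.
lag, nxt = x[:-1], x[1:]
mask = lag <= threshold
phi_hat_low = np.sum(lag[mask] * nxt[mask]) / np.sum(lag[mask]**2)
phi_hat_high = np.sum(lag[~mask] * nxt[~mask]) / np.sum(lag[~mask]**2)
print(phi_hat_low, phi_hat_high)       # close to phi_low and phi_high
```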
Two main approaches exist to deal with a given time series: the parametric one, where we assume that the distribution has a certain form and estimate the parameters; and the nonparametric one, where we loosen the assumptions and estimate the functional form of the system.
The globe according to GARCH
The previous example of GARCH belongs to the parametric case, where we estimate the parameters of the volatility equation by least-squares estimation or maximization of the likelihood. In a way the nonparametric case is a generalization of this, where we estimate the volatility function itself from the data. To do this we apply a regression analysis via a kernel (density) function and a chosen bandwidth. Local linear approximations and the use of polynomials are one way to deal with this system. Global spline approximations are another, where a global interpolation scheme is used to approximate the functional forms used in the state-space equations.
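As a hedged sketch of the kernel approach (the function name, data-generating process and bandwidth are all illustrative; in practice the bandwidth would be selected by cross-validation, not hand-picked), a Nadaraya-Watson kernel regression estimates a conditional mean function from noisy data without assuming its parametric form:

```python
import numpy as np

def nw_regression(x_grid, x, y, h):
    """Nadaraya-Watson estimate of E[y | x] at the points in x_grid,
    using a Gaussian kernel with bandwidth h."""
    u = (x_grid[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u**2)                      # kernel weights
    return (w * y).sum(axis=1) / w.sum(axis=1)   # weighted local average

# Noisy observations of a smooth nonlinear function (sin is an
# arbitrary stand-in for an unknown functional form).
rng = np.random.default_rng(4)
x = rng.uniform(-2, 2, size=2_000)
y = np.sin(x) + 0.2 * rng.standard_normal(x.size)

grid = np.array([-1.0, 0.0, 1.0])
est = nw_regression(grid, x, y, h=0.2)
print(est)                                       # close to sin(grid)
```

The bandwidth h controls the usual bias-variance trade-off: too small and the estimate chases noise, too large and it oversmooths the nonlinearity away.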
Fan and Yao's book has a lot to offer. First, it is readable, even by those with limited knowledge of time-series analysis, as the authors spend time on all the basic concepts. Second, it is self-contained so you do not need other books to understand it. Third, it contains many examples and illustrations to explain the intuition behind the concepts. Fourth, it is up to date and has the latest cutting-edge methods to handle nonlinear time series.
For example, in the second section of chapter 4 the authors deal with ARCH and GARCH models. They start with formal definitions of these processes, then provide theorems on the conditions for ARCH/GARCH processes to be stationary. They give references for the proofs of these theorems and reproduce some of them at the end of the chapter. Some are very recent, such as a theorem by Giraitis, Kokoszka and Leipus proved in 2000. After dealing with these theoretical points they tackle more practical questions, such as estimation algorithms via maximum likelihood methods under Gaussian as well as more general assumptions. They take real-world examples from the financial world and show how GARCH assumptions on currency or equity index time series translate into volatility clustering and asymmetry. They study different kinds of GARCH models, then perform diagnostic tests and graphical investigations. Finally they explain the limitations of GARCH models and their differences from other, more general stochastic volatility models. All this is done in 30 pages, which is truly impressive.
Other chapters deal with other concepts with the same detail-oriented style. From spectral density to polynomial approximations, from auto-correlation functions to the choice of optimal bandwidth, from ergodicity to chaos theory . . . very little, if anything, is left out.
Weaknesses? None really, except that, as the authors state in the introduction, the focus of the book remains on nonparametric techniques. So if you want to know everything about Nonlinear Time Series you will also need a book dealing with parametric concepts. Which one? Let's leave that to another review!
RBC Capital Markets, Royal Bank of Canada, One Liberty Plaza, 165 Broadway, New York, NY 10006, USA
Copyright © Institute of Physics and IOP Publishing Limited 2004.