
Overview of time series analysis (TSA).

Whether our goal in analyzing a particular time series is forecasting or understanding the underlying mechanism, we will be building a probability model for the data. The general end result of probability modeling is a method for ``reducing'' the series to some kind of standard ``random noise''. The point is that once we have such ``noise'' we have extracted all of the useful information. For forecasting, the utility of the ``reduction to random noise'' notion is that ``noise'' cannot be predicted except with a probability statement, usually in the form of a so-called prediction interval. We can then reverse the ``reduction to random noise'' procedure to obtain a prediction interval for the original series. When it comes to understanding the mechanism that generates the series, the idea is that ``noise'' is not understandable, so all of the useful information resides in the mechanism whereby the process is reduced to noise.
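As an illustrative sketch (not taken from these notes), the following shows why a prediction interval is the best one can do for ``noise'': for Gaussian white noise with standard deviation sigma, the best point forecast of the next value is just the mean, and all that can be added is a probability statement such as an approximate 95% interval of plus or minus 1.96 sigma. The simulated series and its scale here are hypothetical.

```python
import numpy as np

# Simulate Gaussian white noise with (hypothetical) standard deviation 2.0.
rng = np.random.default_rng(0)
noise = rng.normal(loc=0.0, scale=2.0, size=10_000)

# Estimate the noise scale from the data, as one would in practice.
sigma_hat = noise.std(ddof=1)

# Approximate 95% prediction interval for the next value (mean is 0 here).
lo, hi = -1.96 * sigma_hat, 1.96 * sigma_hat

# Roughly 95% of the noise values should fall inside the interval.
coverage = np.mean((noise > lo) & (noise < hi))
print(f"95% prediction interval: ({lo:.2f}, {hi:.2f}); empirical coverage {coverage:.3f}")
```

The point of the sketch is that no further ``reduction'' of the noise is possible: the interval is a probability statement, not a prediction of the actual value.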

So, two big issues are evident: what tools are available to develop the ``reduction to noise'' and how do we recognize ``noise'' when we see it?
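On the second question, one standard diagnostic (a common tool, though these notes do not prescribe this particular code) is the sample autocorrelation function: for white noise of length n, the sample autocorrelations are approximately Normal with mean 0 and variance 1/n, so values staying inside plus or minus 1.96/sqrt(n) are consistent with ``noise''. A minimal sketch:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelations at lags 1..max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([np.dot(x[k:], x[:-k]) / denom for k in range(1, max_lag + 1)])

rng = np.random.default_rng(42)
noise = rng.normal(size=500)          # a series that really is white noise

acf = sample_acf(noise, max_lag=20)
bound = 1.96 / np.sqrt(len(noise))    # approximate 95% band under the noise hypothesis

# For genuine noise, roughly 95% of the sample autocorrelations
# should fall inside the band.
frac_inside = np.mean(np.abs(acf) < bound)
print(f"fraction of lags inside the 95% band: {frac_inside:.2f}")
```

A series whose sample autocorrelations mostly respect this band passes a first screening as ``noise''; systematic excursions outside it indicate structure left to model.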

We shall see that there are three typical steps in the ``reduction-to-noise'' process:

(i)
Transform the data, for example by taking logarithms.
(ii)
Remove seasonality and trend to obtain a stationary process.
(iii)
Fit a standard time series model (generally a so-called Auto-Regressive Moving Average, or ARMA, model).

Construction of the ``reduction-to-noise'' procedure does not always proceed in a linear fashion (i.e. determine the transformation in (i), remove seasonality and trend in (ii), and then fit an ARMA model). One will usually jump from one attempt to another in trying to develop each of the three components, assessing the results of each attempt, backing up, jumping ahead, and so on.
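The three steps can be sketched on a simulated monthly series with multiplicative seasonality and trend. Everything below is a hypothetical illustration: the lag-12 difference in step (ii) removes both the seasonal pattern and a linear trend, and the least-squares AR(1) fit in step (iii) stands in for a full ARMA fit (which would normally be done with a time series library).

```python
import numpy as np

# Hypothetical monthly series: exponential trend times a seasonal factor
# times noise, so the log transform makes the structure additive.
rng = np.random.default_rng(1)
t = np.arange(120)
series = np.exp(0.02 * t + 0.3 * np.sin(2 * np.pi * t / 12)
                + 0.1 * rng.normal(size=120))

# (i) Transform: logarithms turn multiplicative effects into additive ones.
x = np.log(series)

# (ii) Remove seasonality and trend: the lag-12 difference cancels the
# seasonal term and reduces the linear trend to a constant, which we
# then subtract off.
x = x[12:] - x[:-12]
x = x - x.mean()

# (iii) Fit an AR(1) model, x_t = phi * x_{t-1} + e_t, by least squares.
phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])

# The residuals are the candidate ``noise''; if the model is adequate
# they should pass the diagnostics for randomness.
residuals = x[1:] - phi * x[:-1]
print(f"phi = {phi:.3f}, residual sd = {residuals.std(ddof=1):.3f}")
```

In practice one would then check the residuals for remaining structure and, as the text notes, circle back to revise any of the three steps.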



Dennis Cox
Thu Jan 16 12:20:07 CST 1997