What is time series analysis and forecasting

Time series analysis

A time series {xt} denotes a set of observations of a statistical feature X arranged in temporal order. The time reference can be point-based, e.g. the hourly electricity price of an energy wholesaler, or interval-based, e.g. the annual electrical energy consumption of a household.

 

Image: Hourly prices (cents per kWh) for seven sales areas of an electricity wholesaler

 

The analysis of a time series aims to reveal dynamic regularities in the course of a characteristic or in the interaction between several characteristics. It is related to regression analysis when time is interpreted as the regressor and the feature X under examination as the regressand.

The methods of time series analysis can be classified as deterministic or stochastic and as univariate or multivariate approaches. Each of these approaches comprises model classes and corresponding techniques for structural specification.

A deterministic approach describes changes in the mean (trend), possible cyclical fluctuations, annual seasonal fluctuations or calendar effects with the help of functions that are usually superimposed additively as a component model in one equation (decomposition). With moving averages or moving medians, individual components can be filtered out of the series (smoothing). Calendar and seasonal adjustment removes the corresponding components, for example to make the trend in labor market data easier to recognize. The least squares method is a simple criterion for choosing a component model that is adequate for the data.
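
As a minimal sketch of such an additive decomposition, the following Python snippet filters a trend with a centered moving average and derives rough seasonal figures from the detrended series; the monthly data are simulated and all variable names are illustrative:

```python
# Additive decomposition sketch on simulated monthly data (trend + season + noise).
import numpy as np
import pandas as pd

idx = pd.date_range("2020-01-01", periods=36, freq="MS")     # 3 years of monthly observations
t = np.arange(len(idx))
x = pd.Series(10 + 0.2 * t                                    # linear trend
              + 3 * np.sin(2 * np.pi * t / 12)                # annual seasonal cycle
              + np.random.normal(0, 0.5, len(t)), index=idx)

trend = x.rolling(window=12, center=True).mean()              # centered moving average smooths out the season
detrended = x - trend
season = detrended.groupby(detrended.index.month).transform("mean")  # rough seasonal figure per calendar month
residual = x - trend - season                                 # remaining irregular component
```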

A stochastic approach understands the data as the realization of one or more adequate random processes and examines their regularities. Before the actual modeling, hidden periodicities have to be uncovered by means of spectral analysis [Schlittgen 2001, p. 353 ff.]. With the help of unit root tests, trend stationarity is checked and a suitable Box-Cox transformation is carried out in the event of a time-dependent variance.
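
The pre-modelling steps mentioned here, periodogram-based spectral analysis, a unit root test and a Box-Cox transformation, could look roughly as follows with scipy and statsmodels; the series below is simulated and purely illustrative:

```python
# Pre-modelling checks: periodogram, ADF unit root test, Box-Cox transform (simulated data).
import numpy as np
from scipy.signal import periodogram
from scipy.stats import boxcox
from statsmodels.tsa.stattools import adfuller

x = np.random.lognormal(mean=2.0, sigma=0.3, size=200) + np.linspace(0, 5, 200)  # positive series

freqs, power = periodogram(x)          # spectral analysis: hidden periodicities appear as peaks
adf_stat, p_value, *_ = adfuller(x)    # unit root test; a large p-value suggests differencing
x_bc, lam = boxcox(x)                  # Box-Cox transform stabilises a time-dependent variance
```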

If trend, seasonal and calendar effects can be removed by forming differences, the remaining residual component can be subjected to an autocorrelation analysis, which provides information about typical reactions to random shocks such as weather or price turbulence. Frequently used, parsimonious model processes are of the ARIMA (Autoregressive Integrated Moving Average) type and consist of an autoregressive component with weighted time-lagged observations and a moving average component with weighted time-lagged shocks. The Box-Jenkins technique [Götze 2010, p. 199] is mostly used to specify the difference equation. The parameters are estimated using gradient methods.
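
A Box-Jenkins style identification and estimation can be sketched with statsmodels; the order (1, 1, 1) below is an assumption for illustration and would in practice be read off the ACF/PACF patterns:

```python
# ARIMA sketch: difference, inspect autocorrelations, estimate, forecast (simulated data).
import numpy as np
from statsmodels.tsa.stattools import acf, pacf
from statsmodels.tsa.arima.model import ARIMA

x = np.cumsum(np.random.normal(0, 1, 300))       # non-stationary series with a stochastic trend

d1 = np.diff(x)                                   # differencing removes the stochastic trend
sample_acf = acf(d1, nlags=20)                    # identification: typical reaction to shocks
sample_pacf = pacf(d1, nlags=20)

model = ARIMA(x, order=(1, 1, 1)).fit()           # estimation of the (p, d, q) difference equation
forecast = model.forecast(steps=24)               # multi-step forecasts from the fitted model
```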

In order to avoid upstream transformations of the data, multi-equation models with separate model processes are sometimes used for level, trend and seasonal dynamics. The individual model processes can, as in UCM (Unobserved Components Models), be structured almost parameter-free, which significantly reduces the identification effort [Brockwell 2002, p. 259 ff.]. This approach can be expanded with deterministic components such as calendar effects and intervention terms, but also with autoregressive terms.
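
A minimal sketch of such a multi-equation structural model, using the UnobservedComponents class from statsmodels with a stochastic level/trend and a seasonal component (the data are simulated and the specification is illustrative):

```python
# Unobserved components sketch: separate processes for level/trend and season.
import numpy as np
from statsmodels.tsa.statespace.structural import UnobservedComponents

t = np.arange(120)
x = 5 + 0.05 * t + 2 * np.sin(2 * np.pi * t / 12) + np.random.normal(0, 0.3, 120)

ucm = UnobservedComponents(x,
                           level="local linear trend",   # stochastic level and slope
                           seasonal=12)                   # stochastic annual seasonal component
res = ucm.fit()
level = res.level.smoothed        # extracted (smoothed) level component
pred = res.forecast(steps=12)     # out-of-sample forecast
```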

In the case of time-varying volatility, a variance equation of the type GARCH (Generalized Autoregressive Conditional Heteroskedasticity) can be included in addition to the actual observation equation; it contains time-lagged weighted variances and squared shocks [Hill 2008, p. 363 ff.]. In this way, dynamic confidence intervals for one-step forecasts can be constructed.
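
The variance equation of a GARCH(1, 1) process and the resulting dynamic one-step confidence intervals can be illustrated with a plain numpy recursion; the parameter values below are assumed for illustration, not estimated:

```python
# GARCH(1, 1) variance recursion with assumed parameters (not estimated).
import numpy as np

shocks = np.random.normal(0, 1, 500)            # hypothetical return shocks
omega, alpha, beta = 0.05, 0.10, 0.85           # alpha + beta < 1 for a finite unconditional variance

sigma2 = np.empty_like(shocks)
sigma2[0] = omega / (1 - alpha - beta)          # unconditional variance as starting value
for t in range(1, len(shocks)):
    # conditional variance: weighted lagged variance plus weighted squared lagged shock
    sigma2[t] = omega + alpha * shocks[t - 1] ** 2 + beta * sigma2[t - 1]

# dynamic 95% one-step interval around a zero-mean forecast
ci_upper = 1.96 * np.sqrt(sigma2)
ci_lower = -1.96 * np.sqrt(sigma2)
```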

A multivariate stochastic approach is used to investigate presumed interactions between several model processes, e.g. between the daily average temperature and the energy consumption of a cold store. Cross-spectral and cross-correlation analyses are used for this purpose. Model equation systems of the type VAR (vector autoregression) are widespread; they enable a simultaneous description of bundles of time series, but contain a large number of model parameters that are difficult to interpret [Hill 2008, p. 346 ff.].
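
A bivariate VAR for the temperature/consumption example might be specified with statsmodels as follows; both series are simulated stand-ins and the lag order of 2 is an arbitrary choice for the sketch:

```python
# Bivariate VAR sketch: simulated temperature and cold-store consumption series.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

n = 200
temp = 15 + 5 * np.sin(2 * np.pi * np.arange(n) / 365) + np.random.normal(0, 1, n)
consumption = 100 + 2 * temp + np.random.normal(0, 3, n)      # consumption reacts to temperature
data = pd.DataFrame({"temperature": temp, "consumption": consumption})

model = VAR(data)
res = model.fit(2)                                            # VAR(2): two lags for each equation
forecast = res.forecast(data.values[-res.k_ar:], steps=24)    # simultaneous forecast of both series
```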

In the case of several non-stationary model processes, it can make sense to examine linear combinations of time series in order to map the interplay of long-term and short-term dynamics (cointegration). Difference equations describe the short-term changes in each individual variable (error correction equations) and include a regression relationship between the variables that stands for the long-term equilibrium [Hill 2008, p. 339].
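
As a sketch of this idea, the following snippet runs an Engle-Granger style cointegration test with statsmodels and builds a lagged equilibrium deviation as an error correction term; the data are simulated, and a full error correction model would regress the differences on this term and on lagged differences:

```python
# Cointegration check and error correction term for two simulated non-stationary series.
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

n = 300
x = np.cumsum(np.random.normal(0, 1, n))           # common stochastic trend
y = 0.8 * x + np.random.normal(0, 1, n)            # y tied to x in the long run

t_stat, p_value, crit = coint(y, x)                # Engle-Granger cointegration test

eq = sm.OLS(y, sm.add_constant(x)).fit()           # long-run equilibrium regression
ect = (y - eq.predict(sm.add_constant(x)))[:-1]    # lagged deviation from equilibrium
dy = np.diff(y)                                    # short-run changes, to be explained by ect and dx
```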

When modeling time series, normally distributed (Gaussian) processes in particular guarantee optimal statistical parameter estimates. With the help of a logarithmic data transformation, the skewness can often be reduced considerably, which supports the assumption of normality. A kernel density estimation uses the empirical frequency distribution instead of an assumed probability distribution. Two choices have to be made: the weight function (kernel function) and the smoothing parameter (bandwidth). For the widely used Gaussian kernel, an optimal bandwidth can be determined [Hildebrandt 2009, p. 76 ff.].
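
A kernel density estimate with a Gaussian kernel and a rule-of-thumb bandwidth can be obtained with scipy; the choice of Silverman's rule here stands in for the bandwidth selection discussed above, and the data are simulated:

```python
# Gaussian kernel density estimate with a rule-of-thumb (Silverman) bandwidth.
import numpy as np
from scipy.stats import gaussian_kde

residuals = np.random.normal(0, 1, 500)                 # hypothetical model residuals or returns

kde = gaussian_kde(residuals, bw_method="silverman")    # Gaussian kernel, Silverman bandwidth
grid = np.linspace(residuals.min(), residuals.max(), 200)
density = kde(grid)                                     # estimated density on an evaluation grid
```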

Literature

Brockwell, P. J.; Davis, R. A.: Introduction to Time Series and Forecasting. 2nd edition, Springer, 2002.

Götze, W.: Techniques of Business Forecasting. 2nd edition, Oldenbourg, 2010.

Hildebrandt, J.: Non-parametric Integrated Return and Risk Forecasts in Asset Management with the Help of Predictor Selection Methods. Cuvillier, Göttingen, 2009 (dissertation).

Hill, R. C.; Griffiths, W. E.; Lim, G. C.: Principles of Econometrics. 3rd edition, John Wiley & Sons, 2008.

Schlittgen, R.; Streitberg, B. H. J.: Time Series Analysis. 9th edition, Oldenbourg, 2001.

 

Author


Prof. Dr. Wolfgang Götze, Stralsund University of Applied Sciences, teaching areas: Mathematics, Statistics, Operations Research and Computer Science, Zur Schwedenschanze 15, House 21, Room 312, 18435 Stralsund
