Time Series analysis (tsa)
statsmodels.tsa
contains model classes and functions that are useful
for time series analysis. Basic models include univariate autoregressive models (AR),
vector autoregressive models (VAR) and univariate autoregressive moving average models
(ARMA). Non-linear models include Markov switching dynamic regression and
autoregression. It also includes descriptive statistics for time series, for example the autocorrelation function, partial autocorrelation function, and periodogram, as well as the corresponding theoretical properties
of ARMA or related processes. It also includes methods to work with autoregressive and
moving average lag-polynomials.
Additionally, related statistical tests and some useful helper functions are available.
Estimation is done by exact or conditional maximum likelihood or by conditional least squares, using either the Kalman filter or direct filters.
Currently, functions and classes have to be imported from the corresponding module, but the main classes will be made available in the statsmodels.tsa namespace. The module structure within statsmodels.tsa is:
stattools : empirical properties and tests: acf, pacf, Granger causality, the ADF unit root test, the KPSS test, the BDS test, the Ljung-Box test, and others.
ar_model : univariate autoregressive process, estimation with conditional and exact maximum likelihood and conditional least-squares
arima_model : univariate ARMA process, estimation with conditional and exact maximum likelihood and conditional least-squares
statespace : Comprehensive statespace model specification and estimation. See the statespace documentation.
vector_ar, var : vector autoregressive process (VAR) and vector error correction models, estimation, impulse response analysis, forecast error variance decompositions, and data visualization tools. See the vector_ar documentation.
kalmanf : estimation classes for ARMA and other models with exact MLE using Kalman Filter
arma_process : properties of arma processes with given parameters, this includes tools to convert between ARMA, MA and AR representation as well as acf, pacf, spectral density, impulse response function and similar
sandbox.tsa.fftarma : similar to arma_process but working in frequency domain
tsatools : additional helper functions, to create arrays of lagged variables, construct regressors for trend, detrend and similar.
filters : helper function for filtering time series
regime_switching : Markov switching dynamic regression and autoregression models
Some additional functions that are useful for time series analysis are in other parts of statsmodels, for example additional statistical tests.
Some related functions are also available in matplotlib, nitime, and scikits.talkbox. Those functions are designed more for use in signal processing, where longer time series are available, and more often work in the frequency domain.
Descriptive Statistics and Tests
acovf : Autocovariance for 1D arrays
acf : Autocorrelation function for 1D arrays
pacf : Partial autocorrelation estimate
pacf_yw : Partial autocorrelation estimated with non-recursive Yule-Walker
pacf_ols : Partial autocorrelations calculated via OLS
pacf_burg : Burg's partial autocorrelation estimator
ccovf : Cross-covariance for 1D arrays
ccf : Cross-correlation function for 1D arrays
periodogram : Periodogram for the natural frequencies of x
adfuller : Augmented Dickey-Fuller unit root test
kpss : Kwiatkowski-Phillips-Schmidt-Shin test for stationarity
coint : Test for no cointegration of a univariate equation
bds : BDS test statistic for independence of a time series
q_stat : Ljung-Box Q statistic
grangercausalitytests : Four tests for Granger non-causality of two time series
levinson_durbin : Levinson-Durbin recursion for autoregressive processes
innovations_algo : Innovations algorithm to convert autocovariances to MA parameters
innovations_filter : Filter observations using the innovations algorithm
levinson_durbin_pacf : Levinson-Durbin algorithm that returns the acf and ar coefficients
arma_order_select_ic : Information criteria for many ARMA models
x13_arima_select_order : Automatic seasonal ARIMA order identification using X-12/X-13 ARIMA
x13_arima_analysis : X-13 ARIMA analysis for monthly or quarterly data
Estimation
The following are the main estimation classes, which can be accessed through statsmodels.tsa.api, and their result classes.
Univariate Autoregressive Processes (AR)
AR : Autoregressive AR(p) model
ARResults : Class to hold results from fitting an AR model
Autoregressive Moving-Average Processes (ARMA) and Kalman Filter
The basic ARMA and ARIMA model and results classes that should be the starting point for most users are:
ARMA : Autoregressive moving average ARMA(p, q) model
ARMAResults : Class to hold results from fitting an ARMA model
ARIMA : Autoregressive integrated moving average ARIMA(p, d, q) model
ARIMAResults : Class to hold results from fitting an ARIMA model
Some advanced, low-level classes and functions that can be used to compute the log-likelihood function for ARMA-type models (rarely needed by end users) include:
KalmanFilter : Kalman filter code intended for use with the ARMA model
arma_innovations : Compute innovations using a given ARMA process
arma_loglike : Compute the log-likelihood of the given data assuming an ARMA process
arma_loglikeobs : Compute the log-likelihood for each observation assuming an ARMA process
arma_score : Compute the score (gradient of the log-likelihood function)
arma_scoreobs : Compute the score per observation
Exponential Smoothing
ExponentialSmoothing : Holt-Winters exponential smoothing
SimpleExpSmoothing : Simple exponential smoothing
Holt : Holt's exponential smoothing
HoltWintersResults : Results from fitting Holt-Winters exponential smoothing models
ARMA Process
The following are tools to work with the theoretical properties of an ARMA process for given lag-polynomials.
ArmaProcess : Theoretical properties of an ARMA process for specified lag-polynomials
ar2arma : Find an ARMA approximation to an AR process
arma2ar : Get the AR representation of an ARMA process
arma2ma : Get the MA representation of an ARMA process
arma_acf : Theoretical autocorrelation function of an ARMA process
arma_acovf : Theoretical autocovariance function of an ARMA process
arma_generate_sample : Generate a random sample of an ARMA process
arma_impulse_response : Get the impulse response function (MA representation) of an ARMA process
arma_pacf : Partial autocorrelation function of an ARMA process
arma_periodogram : Periodogram for an ARMA process given by lag-polynomials ar and ma
deconvolve : Deconvolve divisor out of signal; division of polynomials for n terms
index2lpol : Expand coefficients to a lag polynomial
lpol2index : Remove zeros from a lag polynomial
lpol_fiar : AR representation of fractional integration
lpol_fima : MA representation of fractional integration
lpol_sdiff : Return coefficients for the seasonal difference (1 - L^s)
ArmaFft : FFT tools for ARMA processes (in sandbox.tsa.fftarma)
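A sketch of inspecting the theoretical properties of an ARMA process with ArmaProcess. Note the sign convention: the arguments are the full lag polynomials including the zero-lag coefficient, so the AR coefficients enter with the opposite sign from the usual process notation:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

# (1 - 0.7 L) x_t = (1 + 0.4 L) e_t
ar = np.array([1.0, -0.7])
ma = np.array([1.0, 0.4])
proc = ArmaProcess(ar, ma)

print(proc.isstationary, proc.isinvertible)

theo_acf = proc.acf(5)  # theoretical autocorrelations at lags 0..4
sample = proc.generate_sample(nsample=200)  # a simulated realization
```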
Statespace Models
See the statespace documentation.
Vector ARs and Vector Error Correction Models
See the vector_ar documentation.
Regime switching models
MarkovRegression : First-order k-regime Markov switching regression model
MarkovAutoregression : Markov switching autoregression model
Time Series Filters
bkfilter : Baxter-King bandpass filter
hpfilter : Hodrick-Prescott filter
cffilter : Christiano-Fitzgerald asymmetric, random walk filter
convolution_filter : Linear filtering via convolution
recursive_filter : Autoregressive, or recursive, filtering
miso_lfilter : Use nd convolution to merge inputs, then use lfilter to produce output
fftconvolve3 : Convolve two N-dimensional arrays using FFT
fftconvolveinv : Convolve two N-dimensional arrays using FFT
seasonal_decompose : Seasonal decomposition using moving averages
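For example, the Hodrick-Prescott filter splits a series into trend and cyclical components (a sketch on a synthetic series, assuming statsmodels is installed):

```python
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

t = np.arange(120, dtype=float)
series = 0.5 * t + 2.0 * np.sin(t / 4.0)  # linear trend plus a cycle

# lamb=1600 is the conventional smoothing value for quarterly data
cycle, trend = hpfilter(series, lamb=1600)
print(np.allclose(cycle + trend, series))  # the decomposition is exact
```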
TSA Tools
add_trend : Add a trend and/or constant to an array
detrend : Detrend an array with a trend of given order along axis 0 or 1
lagmat : Create a 2D array of lags
lagmat2ds : Generate a lag matrix for a 2D array, columns arranged by variables
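A sketch of building lagged regressor arrays with lagmat (assumes statsmodels is installed); with trim="both", only rows where every lag is observed are kept:

```python
import numpy as np
from statsmodels.tsa.tsatools import lagmat

x = np.arange(1.0, 7.0)                  # [1, 2, 3, 4, 5, 6]
lags = lagmat(x, maxlag=2, trim="both")  # columns: lag 1, lag 2
print(lags)
```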
VARMA Process
VarmaPoly : Class to keep track of the VARMA polynomial format
Interpolation
dentonm : Modified Denton's method to convert low-frequency to high-frequency data