Dates in timeseries models

In [1]: from __future__ import print_function
        import statsmodels.api as sm
        import numpy as np
        import pandas as pd

Getting started

In [2]: data = sm.datasets.sunspots.load()

Right now an annual date series must be datetimes at the end of the year.

In [3]: from datetime import datetime
        dates = sm.tsa.datetools.dates_from_range('1700', length=len(data.endog))

Using Pandas

Make a pandas TimeSeries.
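A minimal sketch of that step and one way the series might be used afterwards, assuming the data and dates objects created above; pd.Series stands in for the older pd.TimeSeries, and the AR model fit below is an illustrative assumption rather than part of the text above:

endog = pd.Series(np.asarray(data.endog), index=dates)   # annual sunspot counts

# Illustrative follow-up: fit an autoregressive model on the date-indexed series.
ar_model = sm.tsa.AR(endog)
ar_res = ar_model.fit(maxlag=9, method='mle', disp=-1)

# With a date index, predictions can be requested by date strings.
pred = ar_res.predict(start='2005', end='2015')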

CountResults.t_test()

statsmodels.discrete.discrete_model.CountResults.t_test

CountResults.t_test(r_matrix, cov_p=None, scale=None, use_t=None)

Compute a t-test for each linear hypothesis of the form Rb = q.

Parameters:

r_matrix : array-like, str, tuple
    array : If an array is given, a p x k 2d array or length k 1d array specifying the linear restrictions. It is assumed that the linear combination is equal to zero.
    str : The full hypotheses to test can be given as a string. See the examples.
    tuple : A tuple of
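As a hedged example of the string and array forms of r_matrix (the simulated DataFrame, variable names and Poisson fit below are illustrative assumptions, not part of the documentation above):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated count data, illustrative only.
rng = np.random.RandomState(0)
df = pd.DataFrame({"x1": rng.normal(size=300), "x2": rng.normal(size=300)})
df["y"] = rng.poisson(np.exp(0.5 + 0.3 * df["x1"] + 0.3 * df["x2"]))

res = smf.poisson("y ~ x1 + x2", data=df).fit(disp=0)

# One restriction Rb = q: the coefficients on x1 and x2 are equal.
print(res.t_test("x1 = x2"))

# Equivalent array form over (Intercept, x1, x2): R = [0, 1, -1], q = 0.
print(res.t_test([[0, 1, -1]]))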

Nonparametric Methods

This section collects various methods in nonparametric statistics. This includes kernel density estimation for univariate and multivariate data, kernel regression, and locally weighted scatterplot smoothing (lowess). sandbox.nonparametric contains additional functions that are work in progress or don't have unit tests yet. We are planning to include here nonparametric density estimators, especially based on kernel or orthogonal polynomials, smoothers, and tool
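For example, a minimal sketch with simulated data showing univariate kernel density estimation and lowess (the variable names and settings below are illustrative assumptions):

import numpy as np
import statsmodels.api as sm

rng = np.random.RandomState(0)
x = rng.normal(size=500)

# Univariate kernel density estimation: Gaussian kernel, automatic bandwidth.
kde = sm.nonparametric.KDEUnivariate(x)
kde.fit()
grid, density = kde.support, kde.density

# Locally weighted scatterplot smoothing (lowess) of a noisy signal.
y = np.sin(x) + rng.normal(scale=0.3, size=500)
smoothed = sm.nonparametric.lowess(y, x, frac=0.3)   # (x, fitted) pairs sorted by x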

GMM.fit()

statsmodels.sandbox.regression.gmm.GMM.fit

GMM.fit(start_params=None, maxiter=10, inv_weights=None, weights_method='cov', wargs=(), has_optimal_weights=True, optim_method='bfgs', optim_args=None) [source]

Estimate parameters using GMM and return GMMResults.

TODO: weight and covariance arguments still need to be made consistent with similar options in other models, see RegressionResult.get_robustcov_results

Parameters:

start_params : array (optional)
    starting value for parameters in minimization
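Because GMM is an abstract class, fit is normally called on a subclass that supplies moment conditions. The sketch below is a hedged illustration: the IVMomentModel subclass, its moment conditions, and the simulated data are assumptions made for demonstration, not part of the documented API beyond the momcond hook:

import numpy as np
from statsmodels.sandbox.regression.gmm import GMM

class IVMomentModel(GMM):
    # Moment conditions E[z * (y - x b)] = 0 for a linear IV model (illustrative).
    def momcond(self, params):
        resid = self.endog - self.exog.dot(params)
        return self.instrument * resid[:, None]

# Simulated data: one endogenous regressor, two instruments (illustrative only).
rng = np.random.RandomState(0)
z = rng.normal(size=(200, 2))
x = z.dot([1.0, 0.5]) + rng.normal(size=200)
y = 2.0 + 1.5 * x + rng.normal(size=200)
exog = np.column_stack([np.ones(200), x])
instrument = np.column_stack([np.ones(200), z])

model = IVMomentModel(y, exog, instrument, k_moms=3, k_params=2)
res = model.fit(start_params=np.zeros(2), maxiter=2)   # two-step GMM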

QuantRegResults.resid_pearson()

statsmodels.regression.quantile_regression.QuantRegResults.resid_pearson

QuantRegResults.resid_pearson()

Residuals, normalized to have unit variance.

Returns: an array, wresid / sqrt(scale)
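A short hedged example with simulated data; in practice resid_pearson is read as an attribute on the fitted results, and the data below are illustrative assumptions:

import numpy as np
import statsmodels.api as sm

rng = np.random.RandomState(0)
x = rng.uniform(size=200)
y = 1.0 + 2.0 * x + rng.standard_t(df=3, size=200)
exog = sm.add_constant(x)

res = sm.QuantReg(y, exog).fit(q=0.5)    # median regression
pearson = res.resid_pearson              # residuals scaled to unit variance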

GLS.fit_regularized()

statsmodels.regression.linear_model.GLS.fit_regularized

GLS.fit_regularized(method='coord_descent', maxiter=1000, alpha=0.0, L1_wt=1.0, start_params=None, cnvrg_tol=1e-08, zero_tol=1e-08, **kwargs)

Return a regularized fit to a linear regression model.

Parameters:

method : string
    Only the coordinate descent algorithm is implemented.
maxiter : integer
    The maximum number of iteration cycles (an iteration cycle involves running coordinate descent on all variables).
alpha : scalar or array-
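For instance, a hedged sketch with simulated data; the alpha and L1_wt values are arbitrary illustrative choices:

import numpy as np
import statsmodels.api as sm

rng = np.random.RandomState(0)
X = sm.add_constant(rng.normal(size=(100, 5)))
beta = np.array([1.0, 2.0, 0.0, 0.0, -1.5, 0.0])
y = X.dot(beta) + rng.normal(size=100)

# alpha sets the overall penalty strength; L1_wt=1.0 gives a pure lasso penalty.
res = sm.GLS(y, X).fit_regularized(alpha=0.1, L1_wt=1.0)
print(res.params)   # some coefficients are shrunk exactly to zero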

LogitResults.f_test()

statsmodels.discrete.discrete_model.LogitResults.f_test

LogitResults.f_test(r_matrix, cov_p=None, scale=1.0, invcov=None)

Compute the F-test for a joint linear hypothesis. This is a special case of wald_test that always uses the F distribution.

Parameters:

r_matrix : array-like, str, or tuple
    array : An r x k array where r is the number of restrictions to test and k is the number of regressors. It is assumed that the linear combination is equal to zero.
    str : The full hypotheses to test ca
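A hedged example of the string and R-matrix forms; the simulated binary-outcome DataFrame and variable names are illustrative assumptions:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated binary-outcome data, illustrative only.
rng = np.random.RandomState(0)
df = pd.DataFrame({"x1": rng.normal(size=300), "x2": rng.normal(size=300)})
p = 1.0 / (1.0 + np.exp(-(0.2 + 0.8 * df["x1"] - 0.5 * df["x2"])))
df["y"] = rng.binomial(1, p)

res = smf.logit("y ~ x1 + x2", data=df).fit(disp=0)

# Joint hypothesis that both slope coefficients are zero (two restrictions).
print(res.f_test("x1 = 0, x2 = 0"))

# Equivalent R-matrix form: rows are restrictions over (Intercept, x1, x2).
R = np.array([[0, 1, 0],
              [0, 0, 1]])
print(res.f_test(R))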

MixedLM.predict()

statsmodels.regression.mixed_linear_model.MixedLM.predict

MixedLM.predict(params, exog=None, *args, **kwargs)

After a model has been fit, predict returns the fitted values. This is a placeholder intended to be overwritten by individual models.

MNLogit.pdf()

statsmodels.discrete.discrete_model.MNLogit.pdf

MNLogit.pdf(eXB) [source]

NotImplemented

tools.tools.recipr()

statsmodels.tools.tools.recipr

statsmodels.tools.tools.recipr(X) [source]

Return the reciprocal of an array, setting all entries less than or equal to 0 to 0. It therefore presumes that X is positive in general.
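For example (a small sketch; the input values are arbitrary):

import numpy as np
from statsmodels.tools.tools import recipr

x = np.array([4.0, 0.5, 0.0, -2.0])
recipr(x)   # array([0.25, 2.  , 0.  , 0.  ]); nonpositive entries map to 0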