ArmaFft.fftar()

statsmodels.sandbox.tsa.fftarma.ArmaFft.fftar

ArmaFft.fftar(n=None) [source]

Fourier transform of the AR polynomial, zero-padded at the end to length n.

Parameters:
    n : int
        Length of the array after zero-padding.

Returns:
    fftar : ndarray
        FFT of the zero-padded AR polynomial.
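As a rough illustration of what this computes, here is a numpy sketch with a hypothetical AR(2) polynomial (the coefficient values are made up for the example):

```python
import numpy as np

# Hypothetical AR(2) lag polynomial: 1 - 0.5 L + 0.2 L^2
ar = np.array([1.0, -0.5, 0.2])

n = 8  # desired length after zero-padding

# Zero-pad the AR polynomial at the end to length n, then take its FFT;
# this mirrors what ArmaFft.fftar(n) returns for this `ar`.
padded = np.concatenate([ar, np.zeros(n - len(ar))])
fftar = np.fft.fft(padded)

print(fftar.shape)  # (8,)
```

The zero frequency term of the result is simply the sum of the AR coefficients.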

static QuantRegResults.tvalues()

statsmodels.regression.quantile_regression.QuantRegResults.tvalues

QuantRegResults.tvalues()

Return the t-statistic for a given parameter estimate.
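The t-statistic is the ratio of each parameter estimate to its standard error. A minimal numpy sketch, with made-up estimates and standard errors standing in for a fitted model's `params` and `bse`:

```python
import numpy as np

# Hypothetical parameter estimates and their standard errors
params = np.array([1.5, -0.8, 0.3])
bse = np.array([0.5, 0.4, 0.3])

# t-statistic for each parameter: estimate / standard error
tvalues = params / bse

print(tvalues)  # [ 3. -2.  1.]
```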

IVRegressionResults.initialize()

statsmodels.sandbox.regression.gmm.IVRegressionResults.initialize

IVRegressionResults.initialize(model, params, **kwd)

static CountResults.llnull()

statsmodels.discrete.discrete_model.CountResults.llnull

CountResults.llnull()

Value of the log-likelihood of the null model, i.e. a model fit with only a constant.
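For a count model, the null fit is an intercept-only model. A hedged sketch for the Poisson case, where the null MLE of the rate is just the sample mean (the data here are invented for illustration):

```python
import numpy as np
from scipy.special import gammaln

# Hypothetical count data
y = np.array([0, 1, 2, 1, 3, 0, 1, 2])

# Null (intercept-only) Poisson model: the MLE of the rate is the sample mean
lam = y.mean()

# Poisson log-likelihood evaluated at the null fit; CountResults.llnull
# reports the analogous value for the model's own data.
llnull = np.sum(-lam + y * np.log(lam) - gammaln(y + 1))

print(llnull)
```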

robust.scale.stand_mad()

statsmodels.robust.scale.stand_mad

statsmodels.robust.scale.stand_mad(a, c=0.67448975019608171, axis=0) [source]

The standardized median absolute deviation along the given axis.
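A numpy sketch of the computation, assuming the usual standardized-MAD definition: the median absolute deviation about the median, rescaled by c (the default c is the normal quantile at 0.75, which makes the estimate consistent with the standard deviation for normal data):

```python
import numpy as np

def stand_mad_sketch(a, c=0.6744897501960817, axis=0):
    """Median absolute deviation about the median, rescaled by c.

    With c = Phi^{-1}(0.75), this is a robust estimate of the
    standard deviation for normally distributed data.
    """
    a = np.asarray(a)
    center = np.median(a, axis=axis)
    return np.median(np.abs(a - center), axis=axis) / c

# The outlier barely moves the estimate
x = np.array([1.0, 2.0, 3.0, 4.0, 100.0])
print(stand_mad_sketch(x))
```

Note that the function name and implementation above are an illustrative stand-in, not the library source.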

static ProbitResults.llr_pvalue()

statsmodels.discrete.discrete_model.ProbitResults.llr_pvalue

ProbitResults.llr_pvalue()

The p-value of the likelihood-ratio test of the fitted model against the null model.
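The likelihood-ratio statistic is twice the gap between the fitted and null log-likelihoods, and its p-value comes from the chi-squared distribution. A sketch with invented log-likelihoods and an assumed number of restrictions:

```python
from scipy import stats

# Hypothetical log-likelihoods of the fitted and null (intercept-only) models
llf, llnull = -120.0, -130.0
df = 3  # number of non-constant regressors, assumed here

# Likelihood-ratio statistic and its chi-squared survival probability;
# ProbitResults.llr_pvalue is derived from llf and llnull the same way.
llr = 2 * (llf - llnull)
llr_pvalue = stats.chi2.sf(llr, df)

print(llr, llr_pvalue)
```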

OLSInfluence.summary_frame()

statsmodels.stats.outliers_influence.OLSInfluence.summary_frame

OLSInfluence.summary_frame() [source]

Creates a DataFrame with all available influence results.

Returns:
    frame : DataFrame
        A DataFrame with all results.

Notes

The resultant DataFrame contains six variables in addition to the DFBETAS. These include:

    cooks_d : Cook's distance, defined in Influence.cooks_distance
    standard_resid : standardized residuals, defined in Influence.resid_studentized_internal
    hat_diag : the diagonal of the hat (projection) matrix
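As a sketch of how three of these quantities relate, here is a plain numpy computation of the hat diagonal, internally studentized residuals, and Cook's distance for a random OLS problem (the data and dimensions are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical design matrix (with constant column) and response
n, p = 20, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = rng.normal(size=n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (n - p)  # residual variance estimate

# hat_diag: diagonal of the projection ("hat") matrix H = X (X'X)^{-1} X'
H = X @ np.linalg.inv(X.T @ X) @ X.T
hat_diag = np.diag(H)

# standard_resid: internally studentized residuals
standard_resid = resid / np.sqrt(s2 * (1 - hat_diag))

# cooks_d: Cook's distance, combining residual size and leverage
cooks_d = standard_resid**2 * hat_diag / ((1 - hat_diag) * p)

print(hat_diag.sum())  # equals p for a full-rank design
```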

SimpleTable.extend_right()

statsmodels.iolib.table.SimpleTable.extend_right

SimpleTable.extend_right(table) [source]

Return None. Extend each row of self with the corresponding row of table. Does not import formatting from table. This generally makes sense only if the two tables have the same number of rows, but that is not enforced.

Note: to append a table below instead, just use extend, which is the ordinary list method. That generally makes sense only if the two tables have the same number of columns, but that is not enforced.
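Since a SimpleTable behaves like a list of rows, the operation can be sketched with plain lists of rows (formatting aside):

```python
# Two "tables" as lists of rows with the same number of rows
left = [["a1", "a2"], ["b1", "b2"]]
right = [["a3"], ["b3"]]

# extend_right: extend each row of `left` with the corresponding row
# of `right`; like the real method, this mutates in place and returns None
for row, extra in zip(left, right):
    row.extend(extra)

print(left)  # [['a1', 'a2', 'a3'], ['b1', 'b2', 'b3']]
```

By contrast, `left.extend(right)` would stack `right`'s rows below, which is the ordinary-list behaviour the note describes.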

Kernel Density Estimation

A univariate example.

In [1]:
import numpy as np
from scipy import stats
import statsmodels.api as sm
import matplotlib.pyplot as plt
from statsmodels.distributions.mixture_rvs import mixture_rvs

In [2]:
np.random.seed(12345)

In [3]:
obs_dist1 = mixture_rvs([.25, .75], size=10000,
                        dist=[stats.norm, stats.norm],
                        kwargs=(dict(loc=-1, scale=.5), dict(loc=1, scale=.5)))

In [4]:
kde = sm.nonparametric.KDEUnivariate(obs_dist1)
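The notebook fits the KDE with statsmodels; as a self-contained cross-check of the same idea, here is scipy's `gaussian_kde` on a mixture sample of the same shape (25% mass at N(-1, 0.5), 75% at N(1, 0.5)):

```python
import numpy as np
from scipy import stats

np.random.seed(12345)

# Two-component normal mixture, mirroring the notebook's obs_dist1
n = 10000
comp = np.random.rand(n) < 0.25
sample = np.where(comp,
                  np.random.normal(-1, 0.5, n),
                  np.random.normal(1, 0.5, n))

# Gaussian kernel density estimate, evaluated on a grid
kde = stats.gaussian_kde(sample)
grid = np.linspace(-3, 3, 200)
density = kde(grid)

# The estimated density is non-negative and integrates to about 1
integral = density.sum() * (grid[1] - grid[0])
print(integral)
```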

GLS.whiten()

statsmodels.regression.linear_model.GLS.whiten

GLS.whiten(X) [source]

GLS whiten method.

Parameters:
    X : array-like
        Data to be whitened.

Returns:
    np.dot(cholsigmainv, X)

See also: regression.GLS
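A numpy sketch of the documented computation: `cholsigmainv` is the inverse of the Cholesky factor of the error covariance Sigma, so premultiplying by it makes the transformed errors spherical (the Sigma and data below are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical positive-definite error covariance Sigma
A = rng.normal(size=(4, 4))
sigma = A @ A.T + 4 * np.eye(4)

# Inverse Cholesky factor of Sigma, i.e. the model's cholsigmainv
chol = np.linalg.cholesky(sigma)
cholsigmainv = np.linalg.inv(chol)

# whiten(X) returns np.dot(cholsigmainv, X)
X = rng.normal(size=(4, 2))
whitened = cholsigmainv @ X

# Check: whitening Sigma itself recovers the identity,
# since L^{-1} (L L') L^{-T} = I
I = cholsigmainv @ sigma @ cholsigmainv.T
print(np.allclose(I, np.eye(4)))  # True
```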