DiscreteModel.predict()

statsmodels.discrete.discrete_model.DiscreteModel.predict(params, exog=None, linear=False) [source]
Predict the response variable of a model given exogenous variables.

BinaryResults.fittedvalues

statsmodels.discrete.discrete_model.BinaryResults.fittedvalues
In-sample fitted values; accessed as an attribute of the results instance.

stats.power.GofChisquarePower()

class statsmodels.stats.power.GofChisquarePower(**kwds) [source]
Statistical power calculations for a one-sample chisquare test.
Methods:
plot_power([dep_var, nobs, effect_size, ...]): plot power with the number of observations or the effect size on the x-axis
power(effect_size, nobs, alpha, n_bins[, ddof]): calculate the power of a chisquare test for one sample
solve_power([effect_size, nobs, alpha, ...]): solve for any one parameter of the power of a one-sample chisquare test
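A sketch of the `power` / `solve_power` pattern; the effect size, bin count, and alpha below are arbitrary illustrative values.

```python
from statsmodels.stats.power import GofChisquarePower

analysis = GofChisquarePower()

# power of a one-sample chisquare goodness-of-fit test with 4 bins
power = analysis.power(effect_size=0.3, nobs=200, alpha=0.05, n_bins=4)

# solve_power solves for whichever parameter is left as None; here, the
# sample size needed to reach 80% power
nobs_needed = analysis.solve_power(effect_size=0.3, nobs=None, alpha=0.05,
                                   power=0.8, n_bins=4)
```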

sandbox.stats.multicomp.tiecorrect()

statsmodels.sandbox.stats.multicomp.tiecorrect(xranks) [source]
Should be equivalent to scipy.stats.tiecorrect.
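Since the sandbox function is stated to mirror `scipy.stats.tiecorrect`, here is a sketch using the scipy version, which computes the tie correction factor `1 - sum(t**3 - t) / (n**3 - n)` over groups of tied ranks:

```python
from scipy.stats import rankdata, tiecorrect

data = [1, 2, 2, 3, 3, 3]      # one pair and one triple of tied values
ranks = rankdata(data)          # midranks: [1, 2.5, 2.5, 5, 5, 5]

# ties: t=2 gives 2**3 - 2 = 6, t=3 gives 3**3 - 3 = 24; n=6 gives 210
factor = tiecorrect(ranks)      # 1 - (6 + 24) / 210 = 6/7
```

This factor is the correction applied to the variance of rank statistics such as Mann-Whitney U in the presence of ties.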

QuantReg.initialize()

statsmodels.regression.quantile_regression.QuantReg.initialize()

tsa.filters.filtertools.fftconvolveinv()

statsmodels.tsa.filters.filtertools.fftconvolveinv(in1, in2, mode='full') [source]
Convolve two N-dimensional arrays using FFT. See convolve.
Copied from scipy.signal.signaltools, but used here to try out an inverse filter. Doesn't work, or I can't get it to work.
2010-10-23: looks OK to me for 1d, from the results below with a padded data array (fftp), but it doesn't work for a multidimensional inverse filter (fftn). The original signal.fftconvolve also uses f…

tools.eval_measures.bic_sigma()

statsmodels.tools.eval_measures.bic_sigma(sigma2, nobs, df_modelwc, islog=False) [source]
Bayesian information criterion (BIC), or Schwarz criterion.
Parameters:
sigma2 : float. Estimate of the residual variance, or determinant of Sigma_hat in the multivariate case. If islog is true, sigma2 is assumed to already be logged, for example logdetSigma.
nobs : int. Number of observations.
df_modelwc : int. Number of parameters including the constant.
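A sketch of the call and of the `islog` flag's semantics; the variance and counts are arbitrary illustrative values.

```python
import numpy as np
from statsmodels.tools.eval_measures import bic_sigma

sigma2 = 2.5       # illustrative residual variance estimate
nobs = 100
df_modelwc = 3     # parameters including the constant

val = bic_sigma(sigma2, nobs, df_modelwc)

# passing an already-logged variance with islog=True yields the same value
val_log = bic_sigma(np.log(sigma2), nobs, df_modelwc, islog=True)
```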

GLMResults.wald_test()

statsmodels.genmod.generalized_linear_model.GLMResults.wald_test(r_matrix, cov_p=None, scale=1.0, invcov=None, use_f=None)
Compute a Wald test for a joint linear hypothesis.
Parameters:
r_matrix : array-like, str, or tuple.
array : an r x k array where r is the number of restrictions to test and k is the number of regressors. The linear combination is assumed to equal zero.
str : the full hypotheses to test can be given as a string. See the examples.
tuple : …

WLS.fit_regularized()

statsmodels.regression.linear_model.WLS.fit_regularized(method='coord_descent', maxiter=1000, alpha=0.0, L1_wt=1.0, start_params=None, cnvrg_tol=1e-08, zero_tol=1e-08, **kwargs)
Return a regularized fit to a linear regression model.
Parameters:
method : string. Only the coordinate descent algorithm is implemented.
maxiter : integer. The maximum number of iteration cycles (an iteration cycle involves running coordinate descent on all variables).
alpha : scalar or array-like …

CompareMeans.dof_satt()

statsmodels.stats.weightstats.CompareMeans.dof_satt() [source]
Satterthwaite degrees of freedom for unequal variances.
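A sketch with two simulated samples of unequal variance; the Satterthwaite degrees of freedom fall between `min(n1, n2) - 1` and the pooled `n1 + n2 - 2`.

```python
import numpy as np
from statsmodels.stats.weightstats import CompareMeans, DescrStatsW

# simulated illustrative samples with different variances
rng = np.random.default_rng(4)
x1 = rng.standard_normal(30)
x2 = 0.5 + 2.0 * rng.standard_normal(50)

cm = CompareMeans(DescrStatsW(x1), DescrStatsW(x2))
df = cm.dof_satt()
```

This is the degrees-of-freedom correction used by Welch's unequal-variance t-test.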